108 results for Heart rate variability (HRV)
Abstract:
This study examined physiological and performance effects of pre-cooling on medium-fast bowling in the heat. Ten medium-fast bowlers completed two randomised trials involving either cooling (mixed-methods) or control (no cooling) interventions before a 6-over bowling spell in 31.9±2.1°C and 63.5±9.3% relative humidity. Measures included bowling performance (ball speed, accuracy and run-up speeds), physical characteristics (global positioning system monitoring and counter-movement jump height), physiological (heart rate, core temperature, skin temperature and sweat loss), biochemical (serum markers of damage, stress and inflammation) and perceptual variables (perceived exertion and thermal sensation). Mean ball speed (114.5±7.1 vs. 114.1±7.2 km · h−1; P = 0.63; d = 0.09), accuracy (43.1±10.6 vs. 44.2±12.5 AU; P = 0.76; d = 0.14) and total run-up speed (19.1±4.1 vs. 19.3±3.8 km · h−1; P = 0.66; d = 0.06) did not differ between pre-cooling and control, respectively; however, 20-m sprint speed between overs was 5.9±7.3% greater at Over 4 after pre-cooling (P = 0.03; d = 0.75). Pre-cooling reduced skin temperature after the intervention period (P = 0.006; d = 2.28), core temperature and pre-over heart rates throughout (P = 0.01–0.04; d = 0.96–1.74) and sweat loss by 0.4±0.3 kg (P = 0.01; d = 0.34). Mean rating of perceived exertion and thermal sensation were lower during pre-cooling trials (P = 0.004–0.03; d = 0.77–3.13). Despite no observed improvement in bowling performance, pre-cooling maintained between-over sprint speeds and blunted physiological and perceptual responses, easing the thermoregulatory demands of medium-fast bowling in hot conditions.
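The comparisons above report Cohen's d effect sizes alongside P values. As a minimal sketch of how such an effect size is commonly derived, the snippet below uses a pooled-SD formulation with hypothetical ball-speed data; the exact variant the authors used (e.g. pooled SD versus the SD of paired differences) is not stated in the abstract.

```python
import numpy as np

def cohens_d(x, y):
    """Cohen's d between two conditions using a pooled standard deviation.

    This is the classic pooled-SD formulation; the abstract does not state
    which variant the authors used, so treat this as illustrative only.
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    nx, ny = len(x), len(y)
    pooled_sd = np.sqrt(((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2))
    return (x.mean() - y.mean()) / pooled_sd

# Hypothetical ball-speed data (km/h) for pre-cooling vs. control trials
precool = [115.2, 113.8, 116.0, 112.9, 114.7, 115.5, 113.2, 114.9, 115.8, 112.5]
control = [114.6, 113.5, 115.2, 112.4, 114.3, 115.0, 112.8, 114.1, 115.3, 112.0]
print(round(cohens_d(precool, control), 2))
```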
Abstract:
This investigation examined physiological and performance effects of cooling on recovery of medium-fast bowlers in the heat. Eight medium-fast bowlers completed two randomised trials, involving two sessions completed on consecutive days (Session 1: 10-overs and Session 2: 4-overs) in 31 ± 3°C and 55 ± 17% relative humidity. Recovery interventions were administered for 20 min (mixed-method cooling vs. control) after Session 1. Measures included bowling performance (ball speed, accuracy, run-up speeds), physical demands (global positioning system, counter-movement jump), physiological (heart rate, core temperature, skin temperature, sweat loss), biochemical (creatine kinase, C-reactive protein) and perceptual variables (perceived exertion, thermal sensation, muscle soreness). Mean ball speed was higher after cooling in Session 2 (118.9 ± 8.1 vs. 115.5 ± 8.6 km · h−1; P = 0.001; d = 0.67), reducing declines in ball speed between sessions (0.24 vs. −3.18 km · h−1; P = 0.03; d = 1.80). Large effects indicated higher accuracy in Session 2 after cooling (46.0 ± 11.2 vs. 39.4 ± 8.6 arbitrary units [AU]; P = 0.13; d = 0.93) without affecting total run-up speed (19.0 ± 3.1 vs. 19.0 ± 2.5 km · h−1; P = 0.97; d = 0.01). Cooling reduced core temperature, skin temperature and thermal sensation throughout the intervention (P = 0.001–0.05; d = 1.31–5.78) and attenuated creatine kinase (P = 0.04; d = 0.56) and muscle soreness at 24-h (P = 0.03; d = 2.05). Accordingly, mixed-method cooling can reduce thermal strain after a 10-over spell and improve markers of muscular damage and discomfort alongside maintained medium-fast bowling performance on consecutive days in hot conditions.
Abstract:
The aim of this study was to investigate the effect of court surface (clay vs. hard court) on technical, physiological and perceptual responses to on-court training. Four high-performance junior male players performed two identical training sessions, one on a hard court and one on a clay court. Sessions included both physical conditioning and technical elements as led by the coach. Each session was filmed for later notational analysis of stroke count and error rates. Further, players wore a global positioning satellite device to measure distance covered during each session, while heart rate, countermovement jump distance and capillary blood measures of metabolites were measured before, during and following each session. Additionally, coach and athlete ratings of perceived exertion (RPE) were collected following each session. Total duration and distance covered during each session were comparable (P>0.05; d<0.20). While forehand and backhand stroke volumes did not differ between sessions (P>0.05; d<0.30), large effects for increased unforced and forced errors were present on the hard court (P>0.05; d>0.90). Furthermore, large effects for increased heart rate, blood lactate and RPE values were evident on clay compared to hard courts (P>0.05; d>0.90). Additionally, while player and coach RPE on hard courts were similar, there were large effects for coaches to underrate the RPE of players on clay courts (P>0.05; d>0.90). In conclusion, training on clay courts showed trends for increased heart rate, lactate and RPE values, suggesting sessions on clay tend towards higher physiological and perceptual loads than hard courts. Further, coaches appear effective at rating player RPE on hard courts, but may underrate the perceived exertion of sessions on clay courts.
Abstract:
Bomb technicians perform their work while encapsulated in explosive ordnance disposal (EOD) suits. Designed primarily for safety, these suits have an unintended consequence of impairing the body's natural mechanisms for heat dissipation. Purpose: To quantify the heat strain encountered during an EOD operational scenario in the tropical north of Australia. Methods: All active police male bomb technicians located in a tropical region of Australia (n=4, experience 7 ± 2.1 yrs, age 34 ± 2 yrs, height 182.3 ± 5.4 cm, body mass 95 ± 4 kg, VO2max 46 ± 5.7 ml·kg−1·min−1) undertook an operational scenario wearing the Med-Eng EOD 9 suit and helmet (~32 kg). The climatic conditions ranged between 27.1–31.8°C ambient temperature, 66–88% relative humidity, and 30.7–34.3°C wet bulb globe temperature. The scenario involved searching a two-storey, non-air-conditioned building for a target; carrying and positioning equipment for taking an X-ray; carrying and positioning equipment to disrupt the target; and finally clearing the site. Core temperature and heart rate were continuously monitored and used to calculate a physiological strain index (PSI). Urine specific gravity (USG) was used to assess hydration status, and heat-associated symptomology was also recorded. Results: The scenario was completed in 121 ± 22 mins (23.4 ± 0.4% work, 76.5 ± 0.4% rest/recovery). Maximum core temperature (38.4 ± 0.2°C), heart rate (173 ± 5.4 bpm, 94 ± 3.3% max), PSI (7.1 ± 0.4) and USG (1.031 ± 0.002) were all elevated after the simulated operation. Heat-associated symptomology highlighted that moderate-severe levels of fatigue and thirst were universally experienced, muscle weakness and heat sensations were experienced by 75%, and one bomb technician reported confusion and light-headedness. Conclusion: All bomb technicians demonstrated moderate-high levels of heat strain, evidenced by elevated heart rate, core body temperature and PSI. Severe levels of dehydration and noteworthy heat-related symptoms further highlight the risks to health and safety faced by bomb technicians operating in tropical locations.
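The abstract states that core temperature and heart rate were used to calculate a physiological strain index (PSI). A widely cited formulation is that of Moran and colleagues, which scales both variables onto a 0-10 range; the sketch below assumes that formulation (the abstract does not state which variant was applied) and uses illustrative baseline values.

```python
def physiological_strain_index(tc_t, tc_0, hr_t, hr_0):
    """Physiological Strain Index in the style of Moran et al. (1998).

    PSI = 5*(Tc_t - Tc_0)/(39.5 - Tc_0) + 5*(HR_t - HR_0)/(180 - HR_0)
    Core temperature in degrees C, heart rate in beats per minute; bounded 0-10.
    The abstract does not confirm this exact variant, so treat it as an assumption.
    """
    psi = 5.0 * (tc_t - tc_0) / (39.5 - tc_0) + 5.0 * (hr_t - hr_0) / (180.0 - hr_0)
    return max(0.0, min(10.0, psi))

# Example with end-of-scenario values close to the reported group means
# (baseline core temperature and heart rate here are hypothetical)
print(round(physiological_strain_index(tc_t=38.4, tc_0=37.1, hr_t=173, hr_0=70), 1))
```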
Abstract:
Noradrenaline, which occurs naturally in the body, binds to beta-adrenoceptors on the heart, causing the heart to beat faster and with greater force in response to increased demand. This enables the heart to provide oxygenated blood to vital organs. Prolonged overstimulation by noradrenaline can be harmful to the heart and lead to the progression of heart disease. In these circumstances beta-adrenoceptors are blocked with drugs called beta-blockers. Beta-blockers block the effects of noradrenaline by binding to the same site on the beta-adrenoceptor. Some beta-blockers such as CGP12177 can also cause increases in heart rate. Therefore it was proposed that CGP12177 could bind at a different site to noradrenaline. The aim of this study was to determine where CGP12177 binds on the beta-adrenoceptor. The results have revealed a separate binding site named beta-1-low. These results may lead to the development of improved beta-blockers for the management of heart conditions.
Abstract:
This study examined the effects of post-exercise cooling on recovery of neuromuscular, physiological, and cerebral hemodynamic responses after intermittent-sprint exercise in the heat. Nine participants underwent three post-exercise recovery trials, including a control (CONT), mixed-method cooling (MIX), and cold-water immersion (10 °C; CWI). Voluntary force and activation were assessed simultaneously with cerebral oxygenation (near-infrared spectroscopy) pre- and post-exercise, post-intervention, and 1-h and 24-h post-exercise. Measures of heart rate, core temperature, skin temperature, muscle damage, and inflammation were also collected. Both cooling interventions reduced heart rate, core, and skin temperature post-intervention (P < 0.05). CWI hastened the recovery of voluntary force by 12.7 ± 11.7% (mean ± SD) and 16.3 ± 10.5% 1-h post-exercise compared to MIX and CONT, respectively (P < 0.01). Voluntary force remained elevated by 16.1 ± 20.5% 24-h post-exercise after CWI compared to CONT (P < 0.05). Central activation was increased post-intervention and 1-h post-exercise with CWI compared to CONT (P < 0.05), without differences between conditions 24-h post-exercise (P > 0.05). CWI reduced cerebral oxygenation compared to MIX and CONT post-intervention (P < 0.01). Furthermore, cooling interventions reduced cortisol 1-h post-exercise (P < 0.01), although only CWI blunted creatine kinase 24-h post-exercise compared to CONT (P < 0.05). Accordingly, improvements in neuromuscular recovery after post-exercise cooling appear to be dissociated from cerebral oxygenation, instead reflecting reductions in thermoregulatory demands that help sustain force production.
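Voluntary activation reported alongside voluntary force is commonly quantified with the interpolated-twitch technique, in which an electrically evoked twitch superimposed on a maximal contraction is expressed relative to a resting (potentiated) twitch. The abstract does not specify the method used, so the sketch below is an assumed, standard formulation with hypothetical twitch forces.

```python
def voluntary_activation(superimposed_twitch_n, resting_twitch_n):
    """Voluntary activation (%) via the interpolated-twitch technique.

    VA = (1 - superimposed twitch / potentiated resting twitch) * 100
    The abstract reports central activation but not the exact method, so this
    standard formulation is an assumption for illustration.
    """
    return (1.0 - superimposed_twitch_n / resting_twitch_n) * 100.0

# Hypothetical twitch forces (N) evoked during and after a maximal contraction
print(round(voluntary_activation(superimposed_twitch_n=8.0, resting_twitch_n=95.0), 1))
```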
Abstract:
The purpose of the present study was to examine the influence of 3 different high-intensity interval training regimens on the first and second ventilatory thresholds (VT1 and VT2), anaerobic capacity (ANC), and plasma volume (PV) in well-trained endurance cyclists. Before and after 2 and 4 weeks of training, 38 well-trained cyclists (VO2peak = 64.5 ± 5.2 ml·kg−1·min−1) performed (a) a progressive cycle test to measure VO2peak, peak power output (PPO), VT1, and VT2; (b) a time to exhaustion test (Tmax) at their VO2peak power output (Pmax); and (c) a 40-km time-trial (TT40). Subjects were assigned to 1 of 4 training groups (group 1: n = 8, 8 × 60% Tmax at Pmax, 1:2 work-recovery ratio; group 2: n = 9, 8 × 60% Tmax at Pmax, recovery at 65% maximum heart rate; group 3: n = 10, 12 × 30 seconds at 175% PPO, 4.5-minute recovery; control group: n = 11). The TT40 performance, VO2peak, VT1, VT2, and ANC were all significantly increased in groups 1, 2, and 3 (p < 0.05) but not in the control group. However, PV did not change in response to the 4-week training program. Changes in TT40 performance were modestly related to the changes in VO2peak, VT1, VT2, and ANC (r = 0.41, 0.34, 0.42, and 0.40, respectively; all p < 0.05). In conclusion, the improvements in TT40 performance were related to significant increases in VO2peak, VT1, VT2, and ANC but were not accompanied by significant changes in PV. Thus, peripheral adaptations rather than central adaptations are likely responsible for the improved performances witnessed in well-trained endurance athletes following various forms of high-intensity interval training programs.
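Groups 1 and 2 trained with 8 repetitions at 60% of individual Tmax performed at Pmax. The short sketch below illustrates how such a session prescription follows from the two laboratory tests; the Tmax value and the assumption of a 1:2 work:recovery ratio (as in group 1) are illustrative, not taken from the study's individual data.

```python
def prescribe_intervals(tmax_s, fraction=0.6, reps=8, work_recovery_ratio=2.0):
    """Derive interval work/recovery durations from a time-to-exhaustion (Tmax) test.

    tmax_s: time to exhaustion at Pmax, in seconds.
    fraction: proportion of Tmax used per work bout (0.6 = 60%, as in groups 1 and 2).
    work_recovery_ratio: recovery duration as a multiple of work duration (1:2 ratio).
    Returns (work_s, recovery_s, total_session_s); values used below are illustrative.
    """
    work_s = fraction * tmax_s
    recovery_s = work_recovery_ratio * work_s
    total_s = reps * (work_s + recovery_s)
    return work_s, recovery_s, total_s

# Hypothetical cyclist with Tmax = 240 s at Pmax
work, rec, total = prescribe_intervals(tmax_s=240)
print(f"{work:.0f} s work at Pmax, {rec:.0f} s recovery, ~{total / 60:.0f} min session")
```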
Abstract:
Several tests have been devised in an attempt to detect behaviour modification due to training, supplements or diet in horses. These tests rely on subjective observations in combination with physiological measures, such as heart rate (HR) and plasma cortisol concentrations, but these measures do not definitively identify behavioural changes. The aim of the present studies was to develop an objective and relevant measure of horse reactivity. In Study 1, HR responses to auditory stimuli delivered over 6 days, designed to safely startle six geldings confined to individual stalls, were studied to determine whether peak HR, unconfounded by physical exertion, was a reliable measure of reactivity. Both mean (±SEM) resting HR (39.5 ± 1.9 bpm) and peak HR (82 ± 5.5 bpm) in response to being startled were found to be consistent over the 6 days in all horses. In Study 2, HR, plasma cortisol concentrations and speed of departure from an enclosure (reaction speed, RS) were measured in six mares in response to a single stimulus presented daily over 6 days. Peak HR response (133 ± 4 bpm) was consistent over days for all horses, but RS increased (3.02 ± 0.72 m/s on Day 1 increasing to 4.45 ± 0.53 m/s on Day 6; P = 0.005). There was no effect on plasma cortisol, so this variable was not studied further. In Study 3, using the six geldings from Study 1, the RS test was refined and a different startle stimulus was used each day. Again, there was no change in peak HR (97.2 ± 5.8 bpm) or RS (2.9 ± 0.2 m/s on Day 1 versus 3.0 ± 0.7 m/s on Day 6) over time. In the final study, mild sedation using acepromazine maleate (0.04 mg/kg BW i.v.) decreased peak HR in response to a startle stimulus when the horses (n = 8) were confined to a stall (P = 0.006), but not in an outdoor environment when the RS test was performed. However, RS was reduced by the mild sedation (P = 0.02). In conclusion, RS may be used as a practical and objective test to measure both reactivity and changes in reactivity in horses.
Abstract:
Bicycle commuting has the potential to be an effective contributing solution to some of modern society's biggest issues, including cardiovascular disease, anthropogenic climate change and urban traffic congestion. However, individuals shifting from a passive to an active commute mode may increase their potential for air pollution exposure and the associated health risk. This project, consisting of three studies, was designed to investigate the health effects of bicycle commuting in relation to air pollution exposure in a major city in Australia (Brisbane). The aims of the three studies were to: 1) examine the relationship between in-commute air pollution exposure perception, symptoms and risk management; 2) assess the efficacy of commute re-routing as a risk management strategy by determining the exposure potential profile of ultrafine particles along commute route alternatives of low and high proximity to motorised traffic; and 3) evaluate the feasibility of implementing commute re-routing as a risk management strategy, by monitoring exposure to ultrafine particles (UFP; < 0.1 µm) and the consequential physiological and acute inflammatory responses in healthy individuals using their typical, and an alternative to their typical, bicycle commute route under real-world circumstances. The methods of the three studies included: 1) a questionnaire-based investigation with regular bicycle commuters in Brisbane, Australia. Participants (n = 153; age = 41 ± 11 yr; 28% female) reported the characteristics of their typical bicycle commute, along with exposure perception and acute respiratory symptoms, and amenability for using a respirator or re-routing their commute as risk management strategies; 2) inhaled particle counts measured along popular pre-identified bicycle commute route alternatives of low (LOW) and high (HIGH) motorised traffic to the same inner-city destination at peak commute traffic times. During the commute, real-time particle number concentration (PNC; mostly in the UFP range) and particle diameter (PD), heart and respiratory rate, geographical location, and meteorological variables were measured. To determine inhaled particle counts, ventilation rate was calculated from heart-rate-ventilation associations produced from periodic exercise testing; 3) thirty-five healthy adults (mean ± SD: age = 39 ± 11 yr; 29% female) completed two return trips of their typical route (HIGH) and a pre-determined altered route of lower proximity to motorised traffic (LOW; determined by the proportion of on-road cycle paths). Particle number concentration (PNC) and diameter (PD) were monitored in real time in-commute. Acute inflammatory indices of respiratory symptom incidence, lung function and spontaneous sputum (for inflammatory cell analyses) were collected immediately pre-commute, and one and three hours post-commute. The main results of the three studies are that: 1) healthy individuals reported a higher incidence of specific acute respiratory symptoms in- and post- (compared to pre-) commute (p < 0.05). The incidence of specific acute respiratory symptoms was significantly higher for participants with respiratory disorder history compared to healthy participants (p < 0.05). 
The incidence of in-commute offensive odour detection, and the perception of in-commute air pollution exposure, was significantly lower for participants with smoking history compared to healthy participants (p < 0.05). Females reported a significantly higher incidence of in-commute air pollution exposure perception and other specific acute respiratory symptoms, and were more amenable to commute re-routing, compared to males (p < 0.05). Overall, healthy individuals indicated a higher incidence of acute respiratory symptoms in- and post- (compared to pre-) bicycle commuting, with female gender and respiratory disorder history indicating comparably higher susceptibility; 2) total mean PNC of LOW (compared to HIGH) was reduced (1.56 × 10⁴ ± 0.38 × 10⁴ versus 3.06 × 10⁴ ± 0.53 × 10⁴ particles·cm⁻³; p = 0.012). Total estimated ventilation rate did not vary significantly between LOW and HIGH (43 ± 5 versus 46 ± 9 L·min⁻¹; p = 0.136); however, owing to the lower total mean PNC, accumulated inhaled particle counts were 48% lower in LOW compared to HIGH (7.6 × 10⁸ ± 1.5 × 10⁸ versus 14.6 × 10⁸ ± 1.8 × 10⁸; p = 0.003); 3) LOW resulted in a significant reduction in mean PNC (1.91 × 10⁴ ± 0.93 × 10⁴ versus 2.95 × 10⁴ ± 1.50 × 10⁴ particles·cm⁻³; p ≤ 0.001). Commute distance and duration were not significantly different between LOW and HIGH (12.8 ± 7.1 vs. 12.0 ± 6.9 km and 44 ± 17 vs. 42 ± 17 mins, respectively). Besides the incidence of in-commute offensive odour detection (42 vs. 56%; p = 0.019), dust and soot observation (33 vs. 47%; p = 0.038) and nasopharyngeal irritation (31 vs. 41%; p = 0.007), acute inflammatory indices were not significantly associated with in-commute PNC, nor were these indices reduced with LOW compared to HIGH. The main conclusions of the three studies are that: 1) the perception of air pollution exposure levels, and the amenability to adopt exposure risk management strategies where applicable, will aid the general population in shifting from passive, motorised transport modes to bicycle commuting; 2) for bicycle commuting at peak morning commute times, inhaled particle counts and therefore cardiopulmonary health risk may be substantially reduced by decreasing exposure to motorised traffic, which should be considered by both bicycle commuters and urban planners; 3) exposure to PNC, and the incidence of offensive odour and nasopharyngeal irritation, can be significantly reduced by lowering proximity to motorised traffic whilst bicycle commuting, without significantly increasing commute distance or duration, which may bring important benefits for both healthy and susceptible individuals. In summary, the findings from this project suggest that bicycle commuters can significantly lower their exposure to ultrafine particle emissions by varying their commute route to reduce proximity to motorised traffic and associated combustion emissions, without necessarily affecting their commute time. While the health endpoints assessed in healthy individuals were not indicative of acute health detriment, individuals with pre-disposing physiological susceptibility may benefit considerably from this risk management strategy – a necessary research focus given the contemporary increase in both promotion of and participation in bicycle commuting.
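Study 2 above estimated inhaled particle counts by combining in-commute particle number concentration with a ventilation rate derived from individual heart-rate-ventilation associations. The sketch below is a simplified version of that dose calculation; the linear HR-to-ventilation relationship, its coefficients and the example data are assumptions for illustration, and respiratory deposition fraction is not modelled, so the numbers are not intended to reproduce the study's reported values.

```python
import numpy as np

def inhaled_particle_count(pnc_p_cm3, hr_bpm, hr_ve_slope, hr_ve_intercept):
    """Estimate accumulated inhaled particle count over a commute.

    pnc_p_cm3: particle number concentration per 1-min epoch (particles/cm^3).
    hr_bpm: heart rate per 1-min epoch (beats/min).
    A linear heart-rate -> minute-ventilation (L/min) association is assumed;
    the study derived this per participant from periodic exercise testing.
    """
    pnc = np.asarray(pnc_p_cm3, dtype=float)
    hr = np.asarray(hr_bpm, dtype=float)
    ve_l_min = hr_ve_slope * hr + hr_ve_intercept   # minute ventilation, L/min
    ve_cm3_min = ve_l_min * 1000.0                  # convert L/min to cm^3/min
    return float(np.sum(pnc * ve_cm3_min))          # particles inhaled over all epochs

# Hypothetical 3-minute snippet of commute data (values are illustrative only)
pnc = [2.0e4, 3.5e4, 2.8e4]   # particles/cm^3
hr = [110, 125, 118]          # beats/min
print(f"{inhaled_particle_count(pnc, hr, hr_ve_slope=0.55, hr_ve_intercept=-20.0):.2e}")
```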
Abstract:
The increasing prevalence of obesity in society has been associated with a number of atherogenic risk factors such as insulin resistance. Aerobic training is often recommended as a strategy to induce weight loss, with a greater impact of high-intensity levels on cardiovascular function and insulin sensitivity, and a greater impact of moderate-intensity levels on fat oxidation. Anaerobic high-intensity (supramaximal) interval training has been advocated to improve cardiovascular function, insulin sensitivity and fat oxidation. However, obese individuals tend to have a lower tolerance of high-intensity exercise due to discomfort. Furthermore, some obese individuals may compensate for the increased energy expenditure by eating more and/or becoming less active. Recently, both moderate- and high-intensity aerobic interval training have been advocated as alternative approaches. However, it is still uncertain which approach is more effective in terms of increasing fat oxidation, given the issues with levels of fitness and motivation and with compensatory behaviours. Accordingly, the objectives of this thesis were to compare the influence of moderate- and high-intensity interval training on fat oxidation and eating behaviour in overweight/obese men. Two exercise interventions were undertaken by 10-12 overweight/obese men to compare their responses to study variables, including fat oxidation and eating behaviour, during moderate- and high-intensity interval training (MIIT and HIIT). The acute training intervention was a methodological study designed to examine the validity of using exercise intensity from the graded exercise test (GXT), which measured the intensity eliciting maximal fat oxidation (FATmax), to prescribe interval training during 30-min MIIT. The 30-min MIIT session involved 5-min repetitions of workloads 20% below and 20% above FATmax. The acute intervention was extended to involve HIIT in a cross-over design to compare the influence of MIIT and HIIT on eating behaviour, using subjective appetite sensations and food preference assessed through a liking and wanting test. The HIIT consisted of 15-sec intervals at 85% VO2peak interspersed with 15-sec unloaded recovery, with total mechanical work equal to MIIT. The medium-term training intervention was a cross-over 4-week (12 sessions) MIIT and HIIT training programme with a 6-week detraining washout period. The MIIT sessions consisted of 5-min cycling stages at ±20% of the mechanical work at 45% VO2peak, and the HIIT sessions consisted of repeated 30-sec work bouts at 90% VO2peak with 30-sec rest intervals, during identical exercise sessions of between 30 and 45 mins. Assessments included a constant-load test (45% VO2peak for 45 mins) followed by 60-min recovery, at baseline and at the end of the 4-week training, to determine fat oxidation rate. Participants' responses to exercise were assessed using blood lactate (BLa), heart rate (HR) and rating of perceived exertion (RPE), measured during the constant-load test and in the first training session of every week during training. Eating behaviour responses were assessed by measuring subjective appetite sensations, liking and wanting, and ad libitum energy intake. Results of the acute intervention showed that FATmax is a valid method to estimate VO2 and BLa, but not HR and RPE, in the MIIT session. 
While the average rate of fat oxidation during 30-min MIIT was comparable with the rate of fat oxidation at FATmax (0.16 ± 0.09 and 0.14 ± 0.08 g/min, respectively), fat oxidation was significantly higher at minute 25 of MIIT (P≤0.01). In addition, there was no significant difference between MIIT and HIIT in appetite sensations after exercise, although there was a tendency towards lower hunger after HIIT. Different intensities of interval exercise also did not affect explicit liking or implicit wanting. Results of the medium-term intervention indicated that the interval training undertaken did not affect body composition, fasting insulin or fasting glucose. Maximal aerobic capacity significantly increased (P≤0.01) during the GXT (2.8 and 7.0% after MIIT and HIIT, respectively), and fat oxidation significantly increased (P≤0.01) during the acute constant-load exercise test (96 and 43% after MIIT and HIIT, respectively). RPE decreased significantly more after HIIT than after MIIT (P≤0.05), and the decrease in BLa during the constant-load test was greater after HIIT than MIIT, but this difference did not reach statistical significance (P=0.09). In addition, following constant-load exercise, exercise-induced hunger and desire to eat decreased more after HIIT than after MIIT, but the differences were not significant (p value for desire to eat was 0.07). Exercise-induced liking of high-fat sweet (HFSW) and high-fat non-sweet (HFNS) foods increased after MIIT and decreased after HIIT (p value for HFNS was 0.09). The intervention explained 12.4% of the change in fat intake (p = 0.07). This research is significant in that it confirmed two points in the acute study. While the rate of fat oxidation increased during MIIT, the average rate of fat oxidation during 30-min MIIT was comparable with the rate of fat oxidation at FATmax. In addition, manipulating the intensity of acute interval exercise did not affect appetite sensations or liking and wanting. In the medium-term intervention, constant-load exercise-induced fat oxidation significantly increased after interval training, independent of exercise intensity. In addition, the desire to eat, explicit liking for HFNS foods and fat intake collectively indicated that MIIT is accompanied by greater compensatory eating behaviour than HIIT. Findings from this research will assist in developing exercise strategies to provide obese men with various training options. In addition, the finding that overweight/obese men expressed a lower RPE and decreased BLa after HIIT compared with MIIT is contrary to the view that obese individuals may not tolerate high-intensity interval training. Therefore, high-intensity interval training can be advocated for the obese adult male population. Future studies may extend this work by using a longer-term intervention.
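Fat oxidation rate during the constant-load tests would typically be determined by indirect calorimetry; a common approach is the stoichiometric equations of Frayn, computed from VO2 and VCO2 with protein oxidation ignored. The abstract does not name the exact equations used, so the example below should be read as an assumed, illustrative calculation.

```python
def frayn_oxidation_rates(vo2_l_min, vco2_l_min):
    """Whole-body substrate oxidation from indirect calorimetry (Frayn-style equations).

    fat (g/min) = 1.67*VO2 - 1.67*VCO2
    CHO (g/min) = 4.55*VCO2 - 3.21*VO2
    VO2 and VCO2 in L/min; protein oxidation ignored. The thesis abstract does not
    state the exact equations used, so this common formulation is an assumption.
    """
    fat = 1.67 * vo2_l_min - 1.67 * vco2_l_min
    cho = 4.55 * vco2_l_min - 3.21 * vo2_l_min
    return fat, cho

# Hypothetical steady-state gas-exchange values at 45% VO2peak
fat_g_min, cho_g_min = frayn_oxidation_rates(vo2_l_min=1.60, vco2_l_min=1.50)
print(f"fat: {fat_g_min:.2f} g/min, CHO: {cho_g_min:.2f} g/min")
```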
Abstract:
Explosive ordnance disposal (EOD) technicians are required to wear protective clothing to protect themselves from the threats of overpressure, fragmentation, impact and heat. The engineering requirements to minimise these threats result in an extremely heavy and cumbersome clothing ensemble that increases the internal heat generation of the wearer, while the clothing's thermal properties reduce heat dissipation. This study aimed to evaluate the heat strain encountered wearing EOD protective clothing in simulated environmental extremes across a range of differing work intensities. Eight healthy males [age 25±6 years (mean ± sd), height 180±7 cm, body mass 79±9 kg, VO2max 57±6 ml·kg−1·min−1] undertook nine trials while wearing an EOD9 suit (weighing 33.4 kg). The trials involved walking on a treadmill at 2.5, 4 and 5.5 km·h−1 in each of the following environmental conditions: 21, 30 and 37°C wet bulb globe temperature (WBGT), in a randomised controlled crossover design. Trials were ceased if a participant's core temperature reached 39°C, if heart rate exceeded 90% of maximum, if walking time reached 60 minutes, or due to fatigue/nausea. Tolerance times ranged from 10–60 minutes and were significantly reduced at the higher walking speeds and in the hotter environmental conditions. In 15 trials (21%), participants completed 60 minutes of walking; however, this was predominantly at the slower walking speeds in the 21°C WBGT environment. Of the remaining 57 trials, 50 were ceased due to attainment of 90% of maximal heart rate. These near-maximal heart rates resulted in moderate-high levels of physiological strain in all trials, despite core temperature reaching 39°C in only one of the 72 trials.
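The trial-cessation rules above are simple thresholds applied to monitored data; a minimal sketch of that decision logic follows. The thresholds come from the abstract, while the function name and the exact comparison details (e.g. strict versus inclusive inequalities) are assumptions.

```python
def check_trial_termination(core_temp_c, hr_bpm, hr_max_bpm, elapsed_min, symptomatic=False):
    """Return the first termination criterion met, or None to continue the trial.

    Thresholds follow the abstract: core temperature of 39 degrees C, heart rate
    above 90% of maximum, 60 minutes of walking, or reported fatigue/nausea.
    """
    if core_temp_c >= 39.0:
        return "core temperature reached 39 degrees C"
    if hr_bpm > 0.90 * hr_max_bpm:
        return "heart rate exceeded 90% of maximum"
    if elapsed_min >= 60:
        return "60 minutes of walking completed"
    if symptomatic:
        return "fatigue/nausea reported"
    return None

# Hypothetical mid-trial check: HR criterion is the first one triggered
print(check_trial_termination(core_temp_c=38.2, hr_bpm=178, hr_max_bpm=195, elapsed_min=35))
```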
Abstract:
Objective Dehydration and symptoms of heat illness are common among the surface mining workforce. This investigation aimed to determine whether heat strain and hydration status exceeded recommended limits. Methods Fifteen blast crew personnel operating in the tropics were monitored across a 12-hour shift. Heart rate, core body temperature, and urine specific gravity were continuously recorded. Participants self-reported fluid consumption and completed a heat illness symptom inventory. Results Core body temperature averaged 37.46 ± 0.13°C, with a group maximum of 37.98 ± 0.19°C. Mean urine specific gravity was 1.024 ± 0.007, with 78.6% of samples at 1.020 or above. Seventy-three percent of workers reported at least one symptom of heat illness during the shift. Conclusions Core body temperature remained within the recommended limits; however, more than 80% of workers were dehydrated before commencing the shift and tended to remain so for the duration.
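The 1.020 urine-specific-gravity criterion used above to classify samples as dehydrated can be expressed as a trivial classification rule, sketched below; the additional "well hydrated" band under 1.010 is a commonly used convention and is an assumption, not something stated in the abstract.

```python
def hydration_status(usg):
    """Classify hydration from urine specific gravity.

    The 1.020 dehydration threshold follows the abstract; the 'well hydrated'
    band below 1.010 is a common convention and is assumed for illustration.
    """
    if usg >= 1.020:
        return "dehydrated"
    if usg < 1.010:
        return "well hydrated"
    return "minimally dehydrated"

samples = [1.024, 1.031, 1.012, 1.008]   # hypothetical USG readings
print([hydration_status(s) for s in samples])
```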
Abstract:
Background and Purpose The β1-adrenoceptor has at least two binding sites, high and low affinity sites (β1H and β1L, respectively), which mediate cardiostimulation. While β1H-adrenoceptor can be blocked by all clinically used β-blockers, β1L-adrenoceptor is relatively resistant to blockade. Thus, chronic β1L-adrenoceptor activation may mediate persistent cardiostimulation, despite the concurrent blockade of β1H-adrenoceptors. Hence, it is important to determine the potential significance of β1L-adrenoceptors in vivo, particularly in pathological situations. Experimental Approach C57Bl/6 male mice were used. Chronic (4 or 8 weeks) β1L-adrenoceptor activation was achieved by treatment, via osmotic mini pumps, with (-)-CGP12177 (10 mg·kg−1·day−1). Cardiac function was assessed by echocardiography and micromanometry. Key Results (-)-CGP12177 treatment of healthy mice increased heart rate and left ventricular (LV) contractility. (-)-CGP12177 treatment of mice subjected to transverse aorta constriction (TAC), during weeks 4–8 or 4–12 after TAC, led to a positive inotropic effect and exacerbated fibrogenic signalling while cardiac hypertrophy tended to be more severe. (-)-CGP12177 treatment of mice with TAC also exacerbated the myocardial expression of hypertrophic, fibrogenic and inflammatory genes compared to untreated TAC mice. Washout of (-)-CGP12177 revealed a more pronounced cardiac dysfunction after 12 weeks of TAC. Conclusions and Implications β1L-adrenoceptor activation provides functional support to the heart, in both normal and pathological (pressure overload) situations. Sustained β1L-adrenoceptor activation in the diseased heart exacerbates LV remodelling and therefore may promote disease progression from compensatory hypertrophy to heart failure.
Abstract:
Purpose The purpose of this study was to evaluate age and gender differences in objectively measured physical activity (PA) in a population-based sample of students in grades 1–12. Methods Participants (185 male, 190 female) wore a CSA 7164 accelerometer for 7 consecutive days. To examine age-related trends, students were grouped as follows: grades 1–3 (N = 90), grades 4–6 (N = 91), grades 7–9 (N = 96), and grades 10–12 (N = 92). Bouts of PA and minutes spent in moderate-to-vigorous PA (MVPA) and vigorous PA (VPA) were examined. Results Daily MVPA and VPA exhibited a significant inverse relationship with grade level, with the largest differences occurring between grades 1–3 and 4–6. Boys were more active than girls; however, for overall PA, the magnitudes of the gender differences were modest. Participation in continuous 20-min bouts of PA was low to nonexistent. Conclusion Our results support the notion that PA declines rapidly during childhood and adolescence and that accelerometers are feasible alternatives to self-report methods in moderately sized population-level surveillance studies.
Abstract:
Purpose The purpose of this review is to address important methodological issues related to conducting accelerometer-based assessments of physical activity in free-living individuals. Methods We review the extant scientific literature for empirical information related to the following issues: product selection, number of accelerometers needed, placement of accelerometers, epoch length, and days of monitoring required to estimate habitual physical activity. We also discuss the various options related to distributing and collecting monitors and strategies to enhance compliance with the monitoring protocol. Results No definitive evidence currently exists to indicate that one make and model of accelerometer is more valid and reliable than another. Selection of an accelerometer therefore remains primarily an issue of practicality, technical support, and comparability with other studies. Studies employing multiple accelerometers to estimate energy expenditure report only marginal improvements in explanatory power. Accelerometers are best placed on the hip or the lower back. Although the issue of epoch length has not been studied in adults, the use of count cut points based on 1-min time intervals may be inappropriate in children and may result in underestimation of physical activity. Among adults, 3–5 d of monitoring is required to reliably estimate habitual physical activity. Among children and adolescents, the number of monitoring days required ranges from 4 to 9 d, making it difficult to draw a definitive conclusion for this population. Face-to-face distribution and collection of accelerometers is probably the best option in field-based research, but delivery and return by express carrier or registered mail is a viable option. Conclusion Accelerometer-based activity assessments require careful planning and the use of appropriate strategies to increase compliance.
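As a concrete illustration of the epoch and cut-point issues discussed above, the sketch below classifies 1-min epochs of activity counts into MVPA, sums daily minutes, and detects continuous 20-min bouts. The cut-point value and bout definition are illustrative assumptions only; the review itself does not endorse specific values, and it notes that appropriate cut points differ between adults and children.

```python
import numpy as np

def mvpa_minutes_and_bouts(counts_per_min, mvpa_cutpoint=1952, bout_len_min=20):
    """Classify 1-min accelerometer epochs and find sustained MVPA bouts.

    counts_per_min: activity counts for consecutive 1-min epochs.
    mvpa_cutpoint: counts/min threshold for MVPA (illustrative value only).
    Returns (total MVPA minutes, number of continuous bouts >= bout_len_min).
    """
    counts = np.asarray(counts_per_min)
    is_mvpa = counts >= mvpa_cutpoint
    total_min = int(is_mvpa.sum())

    bouts, run = 0, 0
    for epoch in is_mvpa:
        run = run + 1 if epoch else 0
        if run == bout_len_min:      # count each run once, when it first reaches bout length
            bouts += 1
    return total_min, bouts

# Hypothetical day: 30 min of MVPA-level counts embedded in lighter activity
day = [500] * 60 + [2500] * 30 + [800] * 60
print(mvpa_minutes_and_bouts(day))   # -> (30, 1)
```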