59 results for "High intensity physical training"
Abstract:
Resistance training has been shown to reliably and substantially enhance muscle function in older adults, and these improvements can be accompanied by gains in functional performance. Training variables should be manipulated to enhance muscle strength and minimize injury risk in this population.
Abstract:
Physiological and kinematic data were collected from elite under-19 rugby union players to provide a greater understanding of the physical demands of rugby union. Heart rate, blood lactate and time-motion analysis data were collected from 24 players (mean ± s_x̄: body mass 88.7 ± 9.9 kg, height 185 ± 7 cm, age 18.4 ± 0.5 years) during six competitive premiership fixtures. Six players were chosen at random from each of four groups: props and locks, back row forwards, inside backs, outside backs. Heart rate records were classified by the percentage of time spent in four zones (>95%, 85-95%, 75-84%, <75% HRmax). Blood lactate concentration was measured periodically throughout each match, and movements were classified as standing, walking, jogging, cruising, sprinting, utility, rucking/mauling and scrummaging. The heart rate data indicated that props and locks (58.4%) and back row forwards (56.2%) spent significantly more time in high exertion (85-95% HRmax) than inside backs (40.5%) and outside backs (33.9%) (P < 0.001). Inside backs (36.5%) and outside backs (38.5%) spent significantly more time in moderate exertion (75-84% HRmax) than props and locks (22.6%) and back row forwards (19.8%) (P < 0.05). Outside backs (20.1%) spent significantly more time in low exertion (<75% HRmax) than props and locks (5.8%) and back row forwards (5.6%) (P < 0.05). Mean blood lactate concentration did not differ significantly between groups (range: 4.67 mmol·l⁻¹ for outside backs to 7.22 mmol·l⁻¹ for back row forwards; P > 0.05). The motion analysis data indicated that outside backs (5750 m) covered a significantly greater total distance than either props and locks or back row forwards (4400 and 4080 m, respectively; P < 0.05). Inside backs and outside backs covered significantly greater distances walking (1740 and 1780 m, respectively; P < 0.001), in utility movements (417 and 475 m, respectively; P < 0.001) and sprinting (208 and 340 m, respectively; P < 0.001) than either props and locks or back row forwards (walking: 1000 and 991 m; utility movements: 106 and 154 m; sprinting: 72 and 94 m, respectively). Outside backs covered a significantly greater distance sprinting than inside backs (340 vs 208 m; P < 0.001). Forwards maintained a higher level of exertion than backs, owing to more constant motion and a large involvement in static high-intensity activities. Mean blood lactate concentrations of 4.7-7.2 mmol·l⁻¹ indicated a need for 'lactate tolerance' training to improve hydrogen ion buffering and facilitate lactate removal following high-intensity efforts. Furthermore, the large distances covered (4.2-5.6 km) and the intermittent nature of match-play indicated a need for sound aerobic conditioning in all groups (particularly backs) to minimize fatigue and facilitate recovery between high-intensity efforts.
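As a rough illustration of the time-in-zone classification described in this abstract, the sketch below bins a heart-rate record into the four %HRmax zones and reports the percentage of time in each. It is not the authors' analysis code; the function name, the 1 Hz sampling assumption and the example values are hypothetical.

```python
# Illustrative sketch (not the study's analysis code): classify a player's
# heart-rate record into the %HRmax exertion zones described in the abstract.

def time_in_zones(hr_samples, hr_max):
    """Return the percentage of samples falling in each %HRmax zone."""
    zones = {">95%": 0, "85-95%": 0, "75-84%": 0, "<75%": 0}
    for hr in hr_samples:
        pct = 100.0 * hr / hr_max
        if pct > 95:
            zones[">95%"] += 1
        elif pct >= 85:
            zones["85-95%"] += 1
        elif pct >= 75:
            zones["75-84%"] += 1
        else:
            zones["<75%"] += 1
    n = len(hr_samples)
    return {zone: 100.0 * count / n for zone, count in zones.items()}

# Example: hypothetical 1 Hz samples for a player with HRmax = 200 beats/min
print(time_in_zones([150, 172, 185, 191, 196, 178, 140], hr_max=200))
```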
Abstract:
The adaptations of muscle to sprint training can be separated into metabolic and morphological changes. Enzyme adaptations represent a major metabolic adaptation to sprint training, with the enzymes of all three energy systems showing signs of adaptation to training and some evidence of a return to baseline levels with detraining. Myokinase and creatine phosphokinase have shown small increases as a result of short-sprint training in some studies, and elite sprinters appear better able to rapidly break down phosphocreatine (PCr) than the sub-elite. No changes in these enzyme levels have been reported as a result of detraining. Similarly, glycolytic enzyme activity (notably lactate dehydrogenase, phosphofructokinase and glycogen phosphorylase) has been shown to increase after training consisting of either long (> 10-second) or short (< 10-second) sprints. Evidence suggests that these enzymes return to pre-training levels after somewhere between 7 weeks and 6 months of detraining. Mitochondrial enzyme activity also increases after sprint training, particularly when long sprints or short recovery between short sprints are used as the training stimulus. Morphological adaptations to sprint training include changes in muscle fibre type, sarcoplasmic reticulum, and fibre cross-sectional area. An appropriate sprint training programme could be expected to induce a shift toward type IIa muscle, increase muscle cross-sectional area and increase the sarcoplasmic reticulum volume to aid release of Ca2+. Training volume and/or frequency of sprint training in excess of what is optimal for an individual, however, will induce a shift toward slower muscle contractile characteristics. In contrast, detraining appears to shift the contractile characteristics towards type IIb, although muscle atrophy is also likely to occur. Muscle conduction velocity appears to be a potential non-invasive method of monitoring contractile changes in response to sprint training and detraining. In summary, adaptation to sprint training is clearly dependent on the duration of sprinting, recovery between repetitions, total volume and frequency of training bouts. These variables have profound effects on the metabolic, structural and performance adaptations from a sprint-training programme, and these changes take a considerable period of time to return to baseline after a period of detraining. However, the complexity of the interaction between the aforementioned variables and training adaptation, combined with individual differences, clearly complicates the transfer of knowledge and advice from laboratory to coach to athlete.
Abstract:
Increased professionalism in rugby has elicited rapid changes in the fitness profile of elite players. Recent research focusing on the physiological and anthropometric characteristics of rugby players, and on the demands of competition, is reviewed. The paucity of research on contemporary elite rugby players is highlighted, along with the need for standardised testing protocols. Recent data reinforce the pronounced differences in the anthropometric and physical characteristics of the forwards and backs. Forwards are typically heavier, taller, and have a greater proportion of body fat than backs. These characteristics are changing, with forwards developing greater total mass and higher muscularity. The forwards demonstrate superior absolute aerobic and anaerobic power, and muscular strength. Results favour the backs when body mass is taken into account. The scaling of results to body mass can be problematic and future investigations should present results using power function ratios. Recommended tests for elite players include body mass and skinfolds, vertical jump, speed, and the multi-stage shuttle run. Repeat sprint testing is a possible avenue for more specific evaluation of players. During competition, high-intensity efforts are often followed by periods of incomplete recovery. The total work over the duration of a game is lower in the backs compared with the forwards; forwards spend greater time in physical contact with the opposition while the backs spend more time in free running, allowing them to cover greater distances. The intense efforts undertaken by rugby players place considerable stress on anaerobic energy sources, while the aerobic system provides energy during repeated efforts and for recovery. Training should focus on repeated brief high-intensity efforts with short rest intervals to condition players to the demands of the game. Training for the forwards should emphasise the higher work rates of the game, while extended rest periods can be provided to the backs. Players should not only be prepared for the demands of competition, but also the stress of travel and extreme environmental conditions. The greater professionalism of rugby union has increased scientific research in the sport; however, there is scope for significant refinement of investigations on the physiological demands of the game, and sports-specific testing procedures.
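The review's recommendation to report results as power function ratios (allometric scaling) rather than simple ratios to body mass can be illustrated with a minimal sketch. The exponent b is an assumption (values near 0.67 are often quoted for strength and power measures) and should ideally be estimated from the sample; the masses and aerobic power values below are hypothetical, not data from the review.

```python
# Minimal sketch of power-function-ratio (allometric) scaling, as an
# alternative to dividing a performance score by body mass directly.
# The default exponent b = 0.67 is an assumption, not a value from the review.

def power_function_ratio(value, body_mass_kg, b=0.67):
    """Scale a performance measure by body mass raised to the exponent b."""
    return value / body_mass_kg ** b

# Hypothetical absolute VO2max values (l/min) for a 110 kg forward and an 85 kg back
forward = power_function_ratio(5.4, 110)
back = power_function_ratio(4.8, 85)
print(f"forward: {forward:.3f}, back: {back:.3f}")  # the back scores higher once mass is accounted for
```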
Abstract:
Background. The purpose of this study was to examine the reliability of stage of change (SOC) measures for moderate-intensity and vigorous physical activity in two separate samples of young adults. Staging measures have focused on vigorous exercise, but current public health guidelines emphasize moderate-intensity activity. Method. For college students in the USA (n = 105) and in Australia (n = 123), SOC was assessed separately on two occasions for moderate-intensity activity and for vigorous activity. Test-retest repeatability was determined using Cohen's kappa coefficient. Results. In both samples, the reliability scores for the moderate-intensity physical activity staging measure were lower than the scores for the vigorous exercise staging measure. Weighted kappa values for the moderate-intensity staging measure were in the fair to good range for both studies (0.50 and 0.45); for the vigorous staging measure, kappa values were excellent and fair to good (0.76 and 0.72). Conclusions. There is a need to standardize and improve methods for staging moderate-intensity activity, given that such measures are used in public health interventions targeting HEPA (health-enhancing physical activity).
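A weighted kappa of the kind reported above can be computed for an ordinal stage-of-change variable as in the sketch below. The stage coding and the paired observations are hypothetical; only the statistic itself is taken from the abstract.

```python
# Sketch of a test-retest weighted kappa for an ordinal stage-of-change measure.
# Stages coded 1-5 (e.g. precontemplation ... maintenance); data are hypothetical.
from sklearn.metrics import cohen_kappa_score

stage_time1 = [1, 2, 2, 3, 4, 5, 3, 2, 4, 5]
stage_time2 = [1, 2, 3, 3, 4, 5, 2, 2, 4, 4]

kappa_w = cohen_kappa_score(stage_time1, stage_time2, weights="linear")
print(f"Linearly weighted kappa: {kappa_w:.2f}")
```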
Abstract:
Resistance training has been shown to be the most effective exercise mode to induce anabolic adaptations in older men and women. Advances in imaging techniques and histochemistry have increased the ability to detect such changes, confirming the high level of adaptability that remains in aging skeletal muscle. This brief review presents a summary of the resistance-training studies that directly compare chronic anabolic responses to training in older (> 60 years) men and women. Sixteen studies are summarized, most of which indicate similar relative anabolic responses between older men and women after resistance training. Relatively small sample sizes in most of the interventions limited their ability to detect significant sex differences and should be considered when interpreting these studies. Future research should incorporate larger sample sizes with multiple measurement time points for anabolic responses.
Abstract:
Background: The proportion of Australian adults achieving physical activity levels believed to be sufficient for colon cancer prevention was estimated, and sociodemographic correlates (age, gender, educational attainment, occupation, marital status, and children in household) of meeting these levels of activity were analyzed. Methods: Data from the 2000 National Physical Activity Survey were used to estimate the prevalence of participation in physical activity in relation to three criteria: generic public health recommendations, weekly amount of at least moderate-intensity physical activity currently believed to reduce risk of colon cancer, and weekly amount of vigorous-intensity physical activity believed to reduce risk of colon cancer. Results: Overall, 46% of adults met the generic public health criterion, 26% met the colon cancer criterion based on participation in at least moderate-intensity physical activity, and 10% met the colon cancer criterion based on vigorous-intensity physical activity. Women were less likely than men to meet the colon cancer criteria. Younger and more educated persons were more likely to meet all three criteria. The most pronounced differences between gender, age, and educational attainment groups were found for meeting the amount of vigorous-intensity physical activity believed to reduce risk of colon cancer. Conclusions: The population prevalence for meeting proposed physical activity criteria for colon cancer prevention is low and much lower than that related to the more generic public health recommendations. If further epidemiologic studies confirm that high volumes and intensities of activity are required, the public health challenges for colon cancer will be significant.
Abstract:
Background: The effective evaluation of physical activity interventions for older adults requires measurement instruments with acceptable psychometric properties that are sufficiently sensitive to detect changes in this population. Aim: To assess the measurement properties (reliability and validity) of the Community Healthy Activities Model Program for Seniors (CHAMPS) questionnaire in a sample of older Australians. Methods: CHAMPS data were collected from 167 older adults (mean age 79.1, SD 6.3 years) and validated against tests of physical ability and the SF-12 measures of physical and mental health. Responses from a sub-sample of 43 older adults were used to assess 1-week test-retest reliability. Results: Approximately 25% of participants needed assistance to complete the CHAMPS questionnaire. There were low but significant correlations between the CHAMPS scores and the physical performance measures (rho = 0.14-0.32) and the physical health scale of the SF-12 (rho = 0.12-0.24). Reliability coefficients were highest for moderate-intensity (ICC = 0.81-0.88) and lowest for vigorous-intensity physical activity (ICC = 0.34-0.45). Agreement between test-retest estimates of sufficient physical activity for health benefits (≥150 min and ≥5 sessions per week) was high (percent agreement = 88% and Cohen's kappa = 0.68). Conclusion: These findings suggest that the CHAMPS questionnaire has acceptable measurement properties, and is therefore suitable for use among older Australian adults, as long as adequate assistance is provided during administration.
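The abstract does not state which form of the intraclass correlation coefficient was used for the test-retest analysis; the sketch below shows one common choice for such designs, ICC(2,1) (two-way random effects, single measure, after Shrout and Fleiss), applied to hypothetical minutes-per-week scores.

```python
# Illustrative test-retest ICC computation. ICC(2,1) is shown here as one
# common choice; it is an assumption, not necessarily the form used in the study.
import numpy as np

def icc_2_1(scores):
    """ICC(2,1) for an (n_subjects x k_occasions) array of ratings."""
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()   # between-subject
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()   # between-occasion
    ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical CHAMPS-style activity scores at two administrations one week apart
week1 = [120, 300, 45, 210, 90, 400, 150, 60]
week2 = [130, 280, 60, 200, 110, 390, 170, 50]
print(f"ICC(2,1) = {icc_2_1(np.column_stack([week1, week2])):.2f}")
```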
Abstract:
At least 30 minutes of moderate-intensity physical activity accumulated on most, preferably all, days is considered the minimum level necessary to reduce the risk of developing cardiovascular disease. Although the explanation remains unclear, some epidemiological data paradoxically suggest that a very high volume of exercise is associated with poorer cardiovascular health. Although ultra-endurance exercise training has been shown to increase antioxidant defences (and therefore confer a protective effect against oxidative stress), an increase in oxidative stress may contribute to the development of atherosclerosis via oxidative modification of low-density lipoprotein (LDL). Research has also shown that ultra-endurance exercise is associated with acute cardiac dysfunction and injury, and these may also be related to an increase in free radical production. Longitudinal studies are needed to assess whether antioxidant defences are adequate to prevent LDL oxidation that may occur as a result of increased free radical production during very high volumes of exercise. In addition, this work will assist in understanding the cumulative effect of repeated episodes of ultra-endurance exercise-induced myocardial damage.
Abstract:
Purpose: To examine age-related differences in the physical activity behaviors of young adults. Methods: We examined rates of participation in vigorous- and moderate-intensity leisure-time activity and walking, as well as an index of physical activity sufficient for health benefits, in three Australian cross-sectional samples covering the age ranges 18-19, 20-24, and 25-29 yr. Data were collected in 1991, 1996, and 1997/8. Results: There was at least a 15% difference in vigorous-intensity leisure-time physical activity from the 18-19 yr to the 25-29 yr age groups, and at least a 10% difference in moderate-intensity leisure-time physical activity. For the index of sufficient activity, differences across age groups ranged from 9 to 21%. Differences in rates of walking were less than 8%. For all age groups, males had higher rates of participation in vigorous- and moderate-intensity activity than did females, but females had much higher rates of participation in walking than males. Age-associated differences in activity levels were more apparent for males. Conclusions: Promoting walking and various forms of moderate-intensity physical activity to young adult males, and encouraging young adult females to adopt other forms of moderate-intensity activity to complement walking, may help to ameliorate decreases in physical activity over the adult lifespan.
Abstract:
Information processing accounts propose that autonomic orienting reflects the amount of resources allocated to process a stimulus. However, secondary task reaction time (RT), a supposed measure of processing resources, has shown a dissociation from autonomic orienting. The present study tested the hypothesis that secondary task RT reflects a serial processing mechanism. Participants (N = 24) were presented with circle and ellipse shapes and asked to count the number of longer-than-usual presentations of one shape (task-relevant) and to ignore presentations of a second shape (task-irrelevant). Concurrent with the counting task, participants performed a secondary RT task to an auditory probe presented at either a high or low intensity and at two different probe positions following shape onset (50 and 300 ms). Electrodermal orienting was larger during task-relevant shapes than during task-irrelevant shapes, but secondary task RT to the high-intensity probe was slower during the latter. In addition, an underadditive interaction between probe stimulus intensity and probe position was found in secondary RT. The findings are consistent with a serial processing model of secondary RT and suggest that the notion of processing stages should be incorporated into current information-processing models of autonomic orienting.
Abstract:
Tennis played at an elite level requires intensive training characterized by repeated bouts of brief intermittent high intensity exercise over relatively long periods of time (1-3 h or more). Competition can place additional stress on players. The purpose of this study was to investigate the temporal association between specific components of tennis training and competition, the incidence of upper respiratory tract infections (URTI), and salivary IgA, in a cohort of seventeen elite female tennis players. Timed, whole unstimulated saliva samples were collected before and after selected 1-h training sessions at 2-week intervals, over 12 weeks. Salivary IgA concentration was measured by ELISA and the IgA secretion rate calculated (μg IgA·ml⁻¹ × ml saliva·min⁻¹). Players reported URTI symptoms and recorded training and competition in daily logs. Data analysis showed that higher incidence of URTI was significantly associated with increased training duration and load, and competition level, on a weekly basis. Salivary IgA secretion rate (S-IgA) dropped significantly after 1 hour of tennis play. Over the 12-week period, pre-exercise salivary IgA concentration and secretion rate were directly associated with the amount of training undertaken during the previous day and week (p < 0.05). However, the decline in S-IgA after 1 h of intense tennis play was also positively related to the duration and load of training undertaken during the previous day and week (p < 0.05). Although exercise-induced suppression of salivary IgA may be a risk factor, it could not accurately predict the occurrence of URTI in this cohort of athletes.
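The secretion-rate calculation described above (IgA concentration multiplied by the saliva flow rate from a timed collection) can be sketched as follows; the function name and example values are hypothetical.

```python
# Sketch of the salivary IgA secretion-rate calculation:
# secretion rate (μg/min) = IgA concentration (μg/ml) x saliva flow rate (ml/min),
# where flow rate = collected saliva volume / collection time. Values are hypothetical.

def siga_secretion_rate(iga_ug_per_ml, saliva_volume_ml, collection_min):
    flow_rate_ml_per_min = saliva_volume_ml / collection_min
    return iga_ug_per_ml * flow_rate_ml_per_min

# e.g. 80 μg/ml IgA in a 2.0 ml sample collected over 4 minutes
print(siga_secretion_rate(80.0, 2.0, 4.0))  # 40.0 μg/min
```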
Abstract:
A study of the prevalence, intensity and risk factors for soil-transmitted helminth infection was undertaken among school children aged 5-9 years attending a primary school in the fishing village of Peda Jalaripet, Visakhapatnam, South India. One hundred and eighty-nine (92.6%) of 204 children were infected with one or more soil-transmitted helminth parasites. The predominant parasite was Ascaris lumbricoides (prevalence of 91%), followed by Trichuris trichiura (72%) and hookworm (54%). Study of age-specific prevalence and intensity of infection revealed that the prevalence and intensity of A. lumbricoides infection was higher among younger children than older children. Aggregation of parasite infection was observed, with hookworm infection more highly aggregated than either A. lumbricoides or T. trichiura. Multivariate analysis identified parental occupation, child's age and mother's education as the potential risk factors contributing to the high intensity of A. lumbricoides infection. Children from fishing families with low levels of maternal education had the highest intensity of A. lumbricoides infection. As the outcome of chemotherapy programs to control soil-transmitted helminth infection is dependent on the dynamics of their transmission, there is a need for further studies to better define the role of specific factors that determine their prevalence, intensity and aggregation in different epidemiological settings.
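The abstract does not say how aggregation was quantified; one common approach in helminth epidemiology is the negative binomial parameter k (smaller k indicating stronger aggregation), estimated by the method of moments as k = mean² / (variance − mean). The sketch below uses hypothetical egg counts to illustrate the idea and should not be read as the authors' method.

```python
# Hedged sketch: quantify over-dispersion (aggregation) of per-child worm or
# egg counts with the method-of-moments negative binomial parameter k.
# Smaller k means stronger aggregation. Counts below are hypothetical.
import numpy as np

def aggregation_k(counts):
    c = np.asarray(counts, dtype=float)
    mean, var = c.mean(), c.var(ddof=1)
    if var <= mean:
        return float("inf")  # Poisson-like or under-dispersed: no evidence of aggregation
    return mean ** 2 / (var - mean)

# Hypothetical per-child counts for two parasites
ascaris = [0, 5, 12, 30, 8, 15, 22, 9, 18, 25]
hookworm = [0, 0, 0, 2, 0, 55, 0, 1, 0, 90]
print(aggregation_k(ascaris), aggregation_k(hookworm))  # smaller k for the more aggregated parasite
```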