830 results for "Injuries in athletes"


Relevance: 30.00%

Abstract:

Introduction: Hamstring strain injuries (HSI) are the predominant non-contact injury in many sports. Eccentric hamstring muscle weakness following intermittent running has been implicated in the aetiology of HSI. This weakness is sometimes greater eccentrically than concentrically; however, the cause of this contraction-mode-specific phenomenon is unknown. The purpose of this research was to determine whether declines in knee flexor strength following overground repeat sprints are caused by declines in voluntary activation of the hamstring muscles. Methods: Seventeen recreationally active males completed 3 sets of 6 × 20 m overground sprints. Maximal isokinetic concentric and eccentric knee flexor and concentric knee extensor strength was determined at ±180°/s and ±60°/s, while hamstring muscle activation was assessed using surface electromyography, before and 15 minutes after the running protocol. Results: Overground repeat sprint running resulted in a significant decline in eccentric knee flexor strength (31.1 Nm; 95% CI = 21.8 to 40.3 Nm; p < 0.001). However, concentric knee flexor strength was not significantly altered (11.1 Nm; 95% CI = -2.8 to 24.9 Nm; p = 0.2294). Biceps femoris voluntary activation levels displayed a significant decline eccentrically (0.067; 95% CI = 0.002 to 0.063; p = 0.0325), but there was no significant decline concentrically (0.025; 95% CI = -0.018 to 0.043; p = 0.4243) following sprinting. Furthermore, declines in average peak torque at -180°/s could be explained by changes in hamstring activation (R² = 0.70). Moreover, it was the change in lateral hamstring muscle activity that was related to the decrease in knee flexor torque (p = 0.0144). In comparison, medial hamstring voluntary activation showed no change for either eccentric (0.06; 95% CI = -0.033 to 0.102; p = 0.298) or concentric (0.09; 95% CI = -0.03 to 0.16; p = 0.298) muscle actions following repeat sprinting.
Discussion: Eccentric hamstring strength is decreased significantly following overground repeat sprinting. Voluntary activation deficits in the biceps femoris muscle explain a large portion of this weakness. The implications of these findings are significant as the biceps femoris muscle is the most frequently strained of the knee flexors and fatigue is implicated in the aetiology of this injury.
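The strength declines above are reported as a mean difference with a 95% confidence interval from a paired pre/post design. A minimal sketch of that calculation, using invented torque values rather than the study's data:

```python
import math
from statistics import mean, stdev

def paired_decline_ci(pre, post, t_crit):
    """Mean pre-to-post decline and its 95% CI for a paired design.

    t_crit is the two-tailed critical t value for n - 1 degrees of
    freedom (the study's n = 17 would use t ≈ 2.12 for df = 16).
    """
    diffs = [a - b for a, b in zip(pre, post)]
    d_mean = mean(diffs)
    sem = stdev(diffs) / math.sqrt(len(diffs))
    return d_mean, (d_mean - t_crit * sem, d_mean + t_crit * sem)

# Hypothetical eccentric knee flexor torques (Nm), pre and post sprinting
pre = [150, 142, 160, 138, 155, 147]
post = [120, 115, 128, 110, 126, 118]
decline, (lo, hi) = paired_decline_ci(pre, post, t_crit=2.571)  # df = 5
```

A CI that excludes zero, as for the eccentric decline here, corresponds to the reported p < 0.05.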


Background: Hamstring strain injuries (HSI) are prevalent in sport and re-injury rates have been high for many years. Maladaptations following HSI are implicated in injury recurrence; however, nervous system function following HSI has received little attention. Aim: To determine whether recreational athletes with a history of unilateral HSI, who have returned to training and competition, exhibit lower levels of voluntary activation (VA) and median power frequency (MPF) in the previously injured limb compared to the uninjured limb at long muscle lengths. Methods: Twenty-eight recreational athletes were recruited. Of these, 13 athletes had a history of unilateral HSI and 15 had no history of HSI. Following familiarisation, all athletes undertook isokinetic dynamometry testing and surface electromyography assessment of the biceps femoris long head and medial hamstrings during concentric and eccentric contractions at ±180 and ±60 deg/s. Results: The previously injured limb was weaker at all contraction speeds compared to the uninjured limb (+180 deg/s mean difference (MD) = 9.3 Nm, p = 0.0036; +60 deg/s MD = 14.0 Nm, p = 0.0013; -60 deg/s MD = 18.3 Nm, p = 0.0007; -180 deg/s MD = 20.5 Nm, p = 0.0007), whilst VA was lower only in the biceps femoris long head during eccentric contractions (-60 deg/s MD = 0.13, p = 0.0025; -180 deg/s MD = 0.13, p = 0.0003). There were no between-limb differences in medial hamstring VA, or in MPF from either the biceps femoris long head or medial hamstrings, in the injured group. The uninjured group showed no between-limb differences in any of the tested variables. Conclusion: Previously injured hamstrings were weaker than the contralateral uninjured hamstrings at all tested speeds and contraction modes. During eccentric contractions, biceps femoris long head VA was lower in the previously injured limb, suggesting neural control of the biceps femoris long head may be altered following HSI.
Current rehabilitation practices have been unsuccessful in restoring strength and VA following HSI. Restoration of these markers should be considered when determining the success of rehabilitation from HSI. Further investigations are required to elucidate the full impact of lower levels of biceps femoris long head VA following HSI on rehabilitation outcomes and re-injury risk.


Background: Hamstring strain injuries (HSIs) are prevalent in sport and re-injury rates have been high for many years. Whilst much focus has centred on the impact of previous hamstring strain injury on maximal eccentric strength, high rates of torque development are also of interest, given the important role of the hamstrings during the terminal swing phase of gait. The impact of prior strain injury on neuromuscular function of the hamstrings during tasks requiring high rates of torque development has received little attention. The purpose of this study was to determine whether recreational athletes with a history of unilateral hamstring strain injury, who have returned to training and competition, exhibit lower levels of eccentric muscle activation, rate of torque development and impulse at 30, 50 and 100 ms after the onset of electromyographical activity or torque development in the previously injured limb compared to the uninjured limb. Methods: Twenty-six recreational athletes were recruited. Of these, 13 athletes had a history of unilateral hamstring strain injury (all confined to the biceps femoris long head) and 13 had no history of hamstring strain injury. Following familiarisation, all athletes undertook isokinetic dynamometry testing and surface electromyography assessment of the biceps femoris long head and medial hamstrings during eccentric contractions at -60 and -180°/s. Results: In the injured limb of the injured group, compared to the contralateral uninjured limb, rate of torque development and impulse were lower during -60°/s eccentric contractions at 50 (RTD, p = 0.008; IMP, p = 0.005) and 100 ms (RTD, p = 0.001; IMP, p < 0.001) after the onset of contraction. There was also a non-significant trend for rate of torque development during -180°/s contractions to be lower 100 ms after the onset of contraction (p = 0.064). Biceps femoris long head muscle activation was lower at 100 ms at both contraction speeds (-60°/s, p = 0.009; -180°/s, p = 0.009). Medial hamstring activation did not differ between limbs in the injured group. Comparisons in the uninjured group showed no significant between-limb differences for any variable. Conclusion: Previously injured hamstrings displayed lower rate of torque development and impulse during eccentric contractions. Lower muscle activation was confined to the biceps femoris long head. Regardless of whether these deficits are the cause or the result of injury, these findings have important implications for hamstring strain injury and re-injury, and suggest greater attention be given to neural function of the knee flexors.


Although there is a paucity of scientific support for the benefits of warm-up, athletes commonly warm up prior to activity with the intention of improving performance and reducing the incidence of injuries. The purpose of this study was to examine the role of warm-up intensity on both range of motion (ROM) and anaerobic performance. Nine males (age = 21.7 ± 1.6 years, height = 1.77 ± 0.04 m, weight = 80.2 ± 6.8 kg, and VO2max = 60.4 ± 5.4 ml/kg/min) completed four trials. Each trial consisted of hip, knee, and ankle ROM evaluation using an electronic inclinometer and an anaerobic capacity test on the treadmill (time to fatigue at 13 km/h and 20% grade). Subjects underwent no warm-up, or a 15-minute warm-up of running at 60, 70 or 80% VO2max followed by a series of lower limb stretches. Intensity of warm-up had little effect on ROM: ankle dorsiflexion and hip extension significantly increased in all warm-up conditions, hip flexion significantly increased only after the 80% VO2max warm-up, and knee flexion did not change after any warm-up. Heart rate and body temperature were significantly increased (p < 0.05) prior to anaerobic performance for each of the warm-up conditions, but anaerobic performance improved significantly only after warm-up at 60% VO2max (10%) and 70% VO2max (13%). A 15-minute warm-up at an intensity of 60-70% VO2max is therefore recommended to improve ROM and enhance subsequent anaerobic performance.
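The abstract does not say how the %VO2max targets were converted to treadmill speeds; one common approach is the ACSM running equation (VO2 = 3.5 + 0.2·speed + 0.9·speed·grade, with speed in m/min). A purely illustrative sketch under that assumption:

```python
def speed_for_pct_vo2max(vo2max, pct, grade=0.0):
    """Treadmill speed (km/h) whose predicted oxygen cost, from the
    ACSM running equation (ml/kg/min), equals pct * vo2max.
    grade is a fraction (0.20 for 20%); level running by default."""
    target_vo2 = vo2max * pct
    speed_m_min = (target_vo2 - 3.5) / (0.2 + 0.9 * grade)
    return speed_m_min * 60 / 1000  # m/min -> km/h

# e.g. the study's mean VO2max (60.4 ml/kg/min) at the 70% condition
speed_70 = speed_for_pct_vo2max(60.4, 0.70)
```

This yields roughly 11.6 km/h for a 60.4 ml/kg/min athlete at 70% VO2max on the flat, a plausible warm-up pace.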


Truancy is recognised as an indicator of engagement in high-risk behaviours among adolescents. Injuries from road-related risk behaviours continue to be a leading cause of death and disability for early adolescents (13-14 years). The aim of this research was to determine the extent to which truancy relates to increased risk of road-related injuries for early adolescents. Four hundred and twenty-seven Year 9 students (13-14 years) from five high schools in Queensland, Australia, completed a questionnaire about their perceptions of risk and recent injury experience. Self-reported injuries were assessed with the Extended Adolescent Injury Checklist (E-AIC). Injuries resulting from motorcycle use, bicycle use, vehicle use (as passenger or driver), and as a pedestrian were measured for the preceding three months. Students were also asked to indicate whether they sought medical attention for their injuries. Truancy was assessed from self-reported skipping of class or "wagging" school over the same three-month period. The relationship between road-related injuries and truancy was analysed separately for males and females. Results revealed that road-related injuries, and reports of associated medical treatment, were more common among young people who engage in truancy than among non-truant adolescents. These results contribute knowledge about truancy as a risk factor for engagement in road-related risks. The findings have the potential to enhance school policies and injury prevention programs if emphasis is placed on increasing school attendance as a safety measure to decrease road-related injuries for young adolescents.
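The truant vs non-truant injury comparison is a 2×2 contingency analysis; the abstract does not name the test used, but a Pearson chi-square is the standard choice. A sketch with invented counts, not the study's data:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table laid out as:
                     injured   not injured
        truant          a          b
        non-truant      c          d
    """
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts for 427 students (not the study's actual data)
chi2 = chi_square_2x2(40, 60, 50, 277)
significant = chi2 > 3.84  # critical value for df = 1, alpha = .05
```

With these invented counts the injury rate is 40% among truants versus about 15% among non-truants, and the statistic comfortably exceeds the 3.84 threshold.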


Background: Studies on the relationship between performance and design of the throwing frame have been limited. Part I provided only a description of whole-body positioning. Objectives: The specific objectives were (a) to benchmark feet positioning characteristics (i.e. position, spacing and orientation) and (b) to investigate the relationship between performance and these characteristics for male seated discus throwers in the F30s classes. Study Design: Descriptive analysis. Methods: A total of 48 attempts performed by 12 stationary discus throwers in the F33 and F34 classes during the seated discus throwing event of the 2002 International Paralympic Committee Athletics World Championships were analysed. Feet positioning was characterised by three-dimensional data for front and back foot position, as well as by spacing and orientation, corresponding to the distance between and the angle made by the two feet, respectively. Results: Only 4 of the 30 feet positioning characteristics presented a correlation coefficient greater than 0.5: feet spacing on the mediolateral and anteroposterior axes in the F34 class, and back foot position and feet spacing on the mediolateral axis in the F33 class. Conclusions: This study provides key information for a better understanding of the interaction between the throwing technique of elite seated throwers and their throwing frame.
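The screening of positioning characteristics by correlation coefficient can be sketched as a Pearson's r computation; the spacing and distance values below are invented for illustration:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

# Hypothetical mediolateral feet spacing (m) vs throw distance (m)
spacing = [0.30, 0.35, 0.40, 0.45, 0.50]
distance = [18.2, 19.1, 19.8, 20.6, 21.0]
r = pearson_r(spacing, distance)
```

A characteristic would pass the paper's 0.5 screen when |r| > 0.5, as it does for this invented series.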


Purpose: To assess the effects of pre-cooling volume on neuromuscular function and performance in free-paced intermittent-sprint exercise in the heat. Methods: Ten male team-sport athletes completed four randomized trials involving an 85-min free-paced intermittent-sprint exercise protocol in 33°C and 33% relative humidity. Pre-cooling conditions included whole body (WB), head + hand (HH), head (H) and no cooling (CONT), applied for 20 min pre-exercise and 5 min mid-exercise. Maximal voluntary contractions (MVC) were assessed pre- and post-intervention and mid- and post-exercise. Exercise performance was assessed with sprint times, percentage decline and distances covered during free-paced bouts. Core (Tc) and skin (Tsk) temperatures, heart rate, perceptual exertion and thermal stress were monitored throughout. Venous and capillary blood was analyzed for metabolite, muscle damage and inflammatory markers. Results: WB pre-cooling facilitated the maintenance of sprint times during the exercise protocol, with reduced percentage decline (P = 0.04). Mean and total hard running distances increased with pre-cooling by 12% compared to CONT (P < 0.05); specifically, WB was 6-7% greater than HH (P = 0.02) and H (P = 0.001), respectively. No change was evident in mean voluntary or evoked force pre- to post-exercise with WB and HH cooling (P > 0.05). WB and HH cooling reduced Tc by 0.1-0.3°C compared to the other conditions (P < 0.05). With WB cooling, Tsk was suppressed for the entire session (P = 0.001). Heart rate responses following WB cooling were reduced compared to CONT during exercise (P = 0.05; d = 1.07). Conclusion: A relationship between pre-cooling volume and exercise performance seems apparent, as larger surface-area coverage augmented subsequent free-paced exercise capacity, in conjunction with greater suppression of physiological load. The maintenance of MVC with pre-cooling, despite increased work output, suggests a role for centrally mediated mechanisms in exercise pacing regulation and subsequent performance.
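The effect sizes (d) reported here are Cohen's d. One common form, assuming two independent condition means and a pooled standard deviation, can be sketched as follows (the heart-rate values are hypothetical, not the study's):

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d for two independent samples, using the pooled SD."""
    na, nb = len(group_a), len(group_b)
    ma, mb = sum(group_a) / na, sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

# Hypothetical exercise heart rates (bpm): control vs whole-body cooling
cont = [172, 168, 175, 170, 169]
wb = [164, 161, 166, 160, 163]
d = cohens_d(cont, wb)
```

For a repeated-measures design such as this study's, a paired variant (mean difference over the SD of differences) would often be used instead; the pooled form above is the textbook baseline.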


This study examined the effects of pre-cooling duration on performance and neuromuscular function for self-paced intermittent-sprint shuttle running in the heat. Eight male, team-sport athletes completed two 35-min bouts of intermittent-sprint shuttle running separated by a 15-min recovery on three separate occasions (33°C, 34% relative humidity). Mixed-method pre-cooling was completed for 20 min (COOL20), 10 min (COOL10) or no cooling (CONT) and reapplied for 5 min mid-exercise. Performance was assessed via sprint times, percentage decline and shuttle-running distance covered. Maximal voluntary contractions (MVC), voluntary activation (VA) and evoked twitch properties were recorded pre- and post-intervention and mid- and post-exercise. Core temperature (Tc), skin temperature, heart rate, capillary blood metabolites, sweat losses, perceptual exertion and thermal stress were monitored throughout. Venous blood draws pre- and post-exercise were analyzed for muscle damage and inflammation markers. Shuttle-running distances covered were increased 5.2 ± 3.3% following COOL20 (P < 0.05), with no differences observed between COOL10 and CONT (P > 0.05). COOL20 aided in the maintenance of mid- and post-exercise MVC (P < 0.05; d > 0.80), despite no conditional differences in VA (P > 0.05). Pre-exercise Tc was reduced by 0.15 ± 0.13°C with COOL20 (P < 0.05; d > 1.10), and remained lower throughout both COOL20 and COOL10 compared to CONT (P < 0.05; d > 0.80). Pre-cooling reduced sweat losses by 0.4 ± 0.3 kg (P < 0.02; d > 1.15), with COOL20 0.2 ± 0.4 kg less than COOL10 (P = 0.19; d = 1.01). Increased pre-cooling duration lowered physiological demands during exercise heat stress and facilitated the maintenance of self-paced intermittent-sprint performance in the heat. Importantly, the dose-response interaction of pre-cooling and sustained neuromuscular responses may explain the improved exercise performance in hot conditions.


Road traffic crashes have emerged as a major health problem around the world. Road crash fatalities and injuries have been reduced significantly in developed countries, but they are still an issue in low- and middle-income countries. The World Health Organization (WHO, 2009) estimates that the death toll from road crashes in low- and middle-income nations is more than 1 million people per year, or about 90% of the global road toll, even though these countries only account for 48% of the world's vehicles. Furthermore, it is estimated that approximately 265,000 people die every year in road crashes in South Asian countries, and Pakistan stands out with approximately 41,494 deaths per year. Pakistan has the highest rate of fatalities per 100,000 population in the region, and its road crash fatality rate of 25.3 per 100,000 population is more than three times Australia's. High numbers of road crashes not only cause pain and suffering to the population at large, but are also a serious drain on the country's economy, which Pakistan can ill-afford. Most studies identify human factors as the main set of contributing factors to road crashes, well ahead of road environment and vehicle factors. In developing countries especially, attention and resources are required to improve vehicle roadworthiness and poor road infrastructure. However, attention to human factors is also critical. Human factors which contribute to crashes include high-risk behaviours like speeding and drink driving, and neglect of protective behaviours such as helmet wearing and seat belt wearing. Much research has been devoted to the attitudes, beliefs and perceptions which contribute to these behaviours and omissions, in order to develop interventions aimed at increasing safer road use behaviours and thereby reducing crashes.
However, less progress has been made in addressing human factors contributing to crashes in developing countries as compared to the many improvements in road environments and vehicle standards, and this is especially true of fatalistic beliefs and behaviours. This is a significant omission, since in different cultures in developing countries there are strong worldviews in which predestination persists as a central idea, i.e. that one's life (and death) and other events have been mapped out and are predetermined. Fatalism refers to a particular way in which people regard the events that occur in their lives, usually expressed as a belief that an individual does not have personal control over circumstances and that their lives are determined through a divine or powerful external agency (Hazen & Ehiri, 2006). These views are at odds with the dominant themes of modern health promotion movements, and present significant challenges for health advocates who aim to avert road crashes and diminish their consequences. The limited literature on fatalism reveals that it is not a simple concept, with religion, culture, superstition, experience, education and degree of perceived control of one's life all being implicated in accounts of fatalism. One distinction in the literature that seems promising is the distinction between empirical and theological fatalism, although there are areas of uncertainty about how well-defined the distinction between these types of fatalism is. Research into road safety in Pakistan is scarce, as is the case for other South Asian countries. From the review of the literature conducted, it is clear that the descriptions given of the different belief systems in developing countries including Pakistan are not entirely helpful for health promotion purposes and that further research is warranted on the influence of fatalism, superstition and other related beliefs in road safety. 
Based on the information available, a conceptual framework is developed as a means of structuring and focusing the research and analysis. The framework is focused on the influence of fatalism, superstition, religion and culture on beliefs about crashes and road user behaviour. Accordingly, this research aims to provide an understanding of the operation of fatalism and related beliefs in Pakistan to assist in the development and implementation of effective and culturally appropriate interventions. The research examines the influence of fatalism, superstition, and religious and cultural beliefs on risky road use in Pakistan and is guided by three research questions: 1. What are the perceptions of road crash causation in Pakistan, in particular the role of fatalism, superstition, and religious and cultural beliefs? 2. How do fatalism, superstition, and religious and cultural beliefs influence road user behaviour in Pakistan? 3. Do fatalism, superstition, and religious and cultural beliefs work as obstacles to road safety interventions in Pakistan? To address these questions, a qualitative research methodology was developed. The research focused on gathering data through individual in-depth interviewing using a semi-structured interview format. A sample of 30 participants was interviewed in Pakistan in the cities of Lahore, Rawalpindi and Islamabad. The participants included policy makers (with responsibility for traffic law), experienced police officers, religious orators, professional drivers (truck, bus and taxi) and general drivers selected through a combination of purposive, criterion and snowball sampling. The transcripts were translated from Urdu and analysed using a thematic analysis approach guided by the conceptual framework.
The findings were divided into four areas: attribution of crash causation to fatalism; attribution of road crashes to beliefs about superstition and malicious acts; beliefs about road crash causation linked to popular concepts of religion; and implications for behaviour, safety and enforcement. Fatalism was almost universally evident, and expressed in a number of ways. Fate was used to rationalise fatal crashes using the argument that the people killed were destined to die that day, one way or another. Related to this was the sense of either not being fully in control of the vehicle, or not needing to take safety precautions, because crashes were predestined anyway. A variety of superstitious-based crash attributions and coping methods to deal with road crashes were also found, such as belief in the role of the evil eye in contributing to road crashes and the use of black magic by rivals or enemies as a crash cause. There were also beliefs related to popular conceptions of religion, such as the role of crashes as a test of life or a source of martyrdom. However, superstitions did not appear to be an alternative to religious beliefs. Fate appeared as the 'default attribution' for a crash when all other explanations failed to account for the incident. This pervasive belief was utilised to justify risky road use behaviour and to resist messages about preventive measures. There was a strong religious underpinning to the statement of fatalistic beliefs (this reflects popular conceptions of Islam rather than scholarly interpretations), but also an overlap with superstitious and other culturally and religious-based beliefs which have longer-standing roots in Pakistani culture. A particular issue which is explored in more detail is the way in which these beliefs and their interpretation within Pakistani society contributed to poor police reporting of crashes. 
The pervasive nature of fatalistic beliefs in Pakistan affects road user behaviour by supporting continued risk-taking behaviour on the road, and by interfering with public health messages about behaviours which would reduce the risk of traffic crashes. The widespread influence of these beliefs on the ways that people respond to traffic crashes and the death of family members contributes to low crash reporting rates and to a system which appears difficult to change. Fate also appeared to be a major contributing factor to non-reporting of road crashes. There also appeared to be a relationship between police enforcement and (lack of) awareness of road rules. It also appears likely that beliefs can influence police work, especially in the case of road crash investigation and the development of strategies. It is anticipated that the findings could be used as a blueprint for the design of interventions aimed at influencing broad-spectrum health attitudes and practices among communities where fatalism is prevalent. The findings have also identified aspects of beliefs that have complex social implications when designing and piloting driver intervention strategies. By understanding attitudes and behaviours related to fatalism, superstition and other related concepts, it should be possible to improve the education of general road users, such that they are less likely to attribute road crashes to chance, fate, or superstition. This study also underscores the importance of understanding this issue in the higher echelons of society (e.g., policy makers, senior police officers), as their role is vital in dispelling road users' misconceptions about the risks of road crashes. The promotion of an evidence-based, scientific approach to road user behaviour and road safety is recommended, along with improved professional education for police and policy makers.


Since March 2010 in Queensland, legislation has specified the type of restraint and seating row for child passengers under 7 years according to age. This study explored regional parents' child restraint practices and the influence of their health beliefs on those practices. A brief intercept interview was verbally administered to a convenience sample of parent-drivers (n = 123) in Toowoomba in February 2010, after the announcement of the changes to legislation but prior to enforcement. Parents who agreed to be followed up were then re-interviewed after enforcement began (May-June 2010). The Health Belief Model was used to gauge beliefs about susceptibility to crashing, children being injured in a crash, and the likely severity of injuries. Self-efficacy and perceptions about barriers to, and benefits of, using age-appropriate restraints with children were also assessed. Results: There were very high levels of rear seating reported for children (initial interview 91%; follow-up 100%). Dedicated child restraint use was 96.9% at the initial interview, though 11% of restraints were deemed inappropriate for the child's age. Self-reported restraint practices for children under 7 were used to categorise parental practices as 'Appropriate' (all children in an age-appropriate restraint and the rear seat) or 'Inappropriate' (≥1 child inappropriately restrained). 94% of parents were aware of the legislation, but only around one third gave accurate descriptions of the requirements. Nevertheless, 89% of parents were deemed to have 'Appropriate' restraint practices. Parents with 'Inappropriate' practices were significantly more likely than those with 'Appropriate' practices to disagree that child restraints provide better protection for children in a crash than adult seatbelts. For self-efficacy, parents with 'Appropriate' practices were more likely than those with 'Inappropriate' practices to report being 'completely confident' about installing child restraints.
The results suggest that efforts to increase appropriate restraint use should aim to better inform parents about the superior protection offered by child restraints compared with adult seat belts.


Despite a considerable amount of research on traffic injury severity, relatively little is known about the factors influencing it in developing countries, and in particular in Bangladesh. Road traffic crashes are a common headline in the daily newspapers of Bangladesh, which has also recorded one of the highest road fatality rates in the world. This research identifies significant factors contributing to traffic injury severity in Dhaka, a mega-city and the capital of Bangladesh. Road traffic crash data for the 5 years from 2007 to 2011, covering about 2714 traffic crashes, were collected from the Dhaka Metropolitan Police (DMP). The severity level of these crashes was documented on a 4-point ordinal scale: no injury (property damage), minor injury, severe injury, and death. An ordered probit regression model was estimated to identify factors contributing to injury severity. Results show that night-time crashes are associated with higher injury severity, as are single-vehicle crashes. Crashes on highway sections within the city are found to be more injurious than crashes along arterial and feeder roads. The likelihood of severe injury is lower, however, on road sections monitored and enforced by the traffic police. The likelihood of injury is lower on two-way traffic arrangements than on one-way arrangements, and at four-legged intersections and roundabouts compared to road segments. The findings are compared with those from developed countries, and the implications of this research are discussed in terms of policy settings for developing countries.
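An ordered probit model maps a linear predictor and a set of estimated thresholds to probabilities over the four severity categories. A minimal sketch of that mapping (the coefficients and cut-points below are invented for illustration, not the study's estimates):

```python
import math

def norm_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def ordered_probit_probs(xb, cuts):
    """Category probabilities from an ordered probit model.

    xb   : linear predictor (x'beta) for one crash
    cuts : increasing thresholds separating the K ordered outcomes
           (here K = 4: no injury, minor, severe, death)
    """
    bounds = [-math.inf] + list(cuts) + [math.inf]
    return [norm_cdf(bounds[k + 1] - xb) - norm_cdf(bounds[k] - xb)
            for k in range(len(bounds) - 1)]

# Invented coefficients: night-time and single-vehicle both raise severity
xb_night_single = 0.4 * 1 + 0.3 * 1
probs = ordered_probit_probs(xb_night_single, cuts=[0.0, 0.8, 1.6])
baseline = ordered_probit_probs(0.0, cuts=[0.0, 0.8, 1.6])
```

A positive coefficient shifts probability mass toward the more severe categories, which is how "night-time is associated with higher severity" reads off the model.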


Paramedics play an important role in out-of-hospital health care. They provide unscheduled care, assisting both patients with minor injuries and those experiencing life-threatening emergencies. Increasingly, paramedics are called on to manage chronic and complex health needs, including symptom relief for patients at the end of life. However, paramedics may not be well prepared to offer palliative care, as practice guidelines and education tend to focus on the management of acute medical emergencies and major trauma. Emergency medical services that employ paramedics rarely have practice guidelines or protocols that deal specifically with palliative care.


Bovine colostrum has been shown to influence the cytokine production of bovine leukocytes. However, it remains unknown whether processed bovine colostrum, a supplement popular among athletes to enhance immune function, is able to modulate cytokine secretion of human lymphocytes and monocytes. The aim of this investigation was to determine the ability of a commercially available bovine colostrum protein concentrate (CPC) to stimulate cytokine production by human peripheral blood mononuclear cells (PBMCs). Blood was sampled from four healthy male endurance athletes who had abstained from exercise for 48 h. PBMCs were separated and cultured with bovine CPC concentrations of 0 (control), 1.25, 2.5, and 5% with and without lipopolysaccharide (LPS) (3 microg/mL) and phytohemagglutinin (PHA) (2.5 microg/mL). Cell supernatants were collected at 6 and 24 h of culture for the determination of tumor necrosis factor (TNF), interferon (IFN)-gamma, interleukin (IL)-10, IL-6, IL-4, and IL-2 concentrations. Bovine CPC significantly stimulated the release of IFN-gamma, IL-10, and IL-2 (p < 0.03). The addition of LPS to PBMCs cocultured with bovine CPC significantly stimulated the release of IL-2 and inhibited the early release of TNF, IL-6, and IL-4 (p < 0.02). Phytohemagglutinin stimulation in combination with bovine CPC significantly increased the secretion of IL-10 and IL-2 at 6 h of culture and inhibited IFN-gamma and TNF (p < 0.05). These data show that a commercial bovine CPC is able to modulate in vitro cytokine production of human PBMCs. Alterations in cytokine secretion may be a potential mechanism for the reported benefits associated with supplementation.


Objectives: The relationship between performance variability and accuracy in cricket fast bowlers of different skill levels was investigated under three different task conditions. Bowlers of different skill levels were examined to observe whether they could adapt movement patterns to maintain performance accuracy on a bowling skills test. Design: Eight national, 12 emerging and 12 junior pace bowlers completed an adapted version of the Cricket Australia bowling skills test, in which they performed 30 trials involving short (n = 10), good (n = 10), and full (n = 10) length deliveries. Methods: Bowling accuracy was recorded by digitising ball position relative to the centre of a target. Performance measures were mean radial error (accuracy), variable error (consistency), centroid error (bias), bowling score and ball speed. Radial error changes across the duration of the skills test were used to record accuracy adjustment in subsequent deliveries. Results: Elite fast bowlers performed better in speed, accuracy, and test scores than developing athletes. Bowlers who were less variable were also more accurate across all delivery lengths. National and emerging bowlers were able to adapt subsequent performance trials within the same bowling session for short-length deliveries. Conclusions: Accuracy and adaptive variability were key components of elite fast-bowling performance and improved with skill level. In this study, only national elite bowlers showed the requisite levels of adaptive variability to bowl a range of lengths to different pitch locations.
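The three error measures can be computed from digitised ball positions as follows. This is one common formulation (VE is sometimes defined via standard deviations instead of mean distances); the coordinates are hypothetical:

```python
import math

def accuracy_measures(points, target=(0.0, 0.0)):
    """Mean radial error (accuracy), variable error (consistency) and
    centroid error (bias) for a set of (x, y) ball positions.

    MRE: mean distance of each delivery from the target.
    CE : distance of the deliveries' centroid from the target.
    VE : mean distance of each delivery from that centroid.
    """
    tx, ty = target
    n = len(points)
    mre = sum(math.hypot(x - tx, y - ty) for x, y in points) / n
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    ce = math.hypot(cx - tx, cy - ty)
    ve = sum(math.hypot(x - cx, y - cy) for x, y in points) / n
    return mre, ve, ce

# Hypothetical pitch locations (m) of four deliveries around the target
points = [(0.1, 0.0), (-0.1, 0.0), (0.0, 0.2), (0.0, -0.2)]
mre, ve, ce = accuracy_measures(points)
```

For this symmetric scatter the centroid sits on the target, so all error is spread (VE equals MRE) and there is no bias (CE is zero); a bowler who consistently misses to one side would instead show a large CE with a small VE.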