921 results for Trail-following


Relevance: 20.00%

Abstract:

Diagnosis threat is a psychosocial factor that has been proposed to contribute to poor outcomes following mild traumatic brain injury (mTBI). This threat is thought to impair the cognitive test performance of individuals with mTBI by invoking negative injury stereotypes. University students (N = 45, 62.2% female) with a history of mTBI were randomly allocated to a diagnosis threat (DT; n = 15), reduced threat (DT-reduced; n = 15), or neutral (n = 15) group. The reduced threat condition invoked a positive stereotype (i.e., that people with mTBI can perform well on cognitive tests). All participants were given neutral instructions before they completed baseline tests of (a) objective cognitive function across a number of domains, (b) psychological symptoms, and (c) post-concussion syndrome (PCS) symptoms, including self-reported cognitive and emotional difficulties. Participants then received either neutral, DT, or DT-reduced instructions before repeating the tests. Results were analyzed using separate mixed-model ANOVAs, one for each dependent measure. The only significant result was for the 2 × 3 ANOVA on an objective test of attention/working memory, Digit Span (p < .05), such that the DT-reduced group performed better than the other two groups, which did not differ from each other. Although not consistent with predictions or earlier DT studies, the absence of group differences on most tests fits with several recent DT findings. The results of this study suggest that it is timely to reconsider the role of DT as a unique contributor to poor mTBI outcome.
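The 2 × 3 (group × time) mixed-model ANOVA described above can be reproduced with the pingouin library in Python. The sketch below is illustrative only: the Digit Span scores are simulated and all column names are hypothetical, as the study data are not available.

```python
# Minimal sketch of a 2 x 3 mixed-model ANOVA (time: within-subject, group: between-subject),
# using simulated Digit Span scores; column names are hypothetical, not the study's.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(0)
groups = ["neutral", "DT", "DT-reduced"]
rows = []
for pid in range(45):                              # 15 participants per group
    group = groups[pid % 3]
    ability = rng.normal(10, 2)                    # participant's underlying span ability
    boost = 1.5 if group == "DT-reduced" else 0.0  # positive-stereotype benefit at retest
    for time in ("baseline", "post"):
        score = ability + (boost if time == "post" else 0.0) + rng.normal(0, 1)
        rows.append({"id": pid, "group": group, "time": time, "digit_span": score})
df = pd.DataFrame(rows)

# Mixed ANOVA: 'time' is the repeated factor, 'group' the between-subjects factor.
aov = pg.mixed_anova(data=df, dv="digit_span", within="time",
                     between="group", subject="id")
print(aov[["Source", "DF1", "DF2", "F", "p-unc"]])
```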

Relevance: 20.00%

Abstract:

Background: Acute respiratory illness, a leading cause of cough in children, accounts for a substantial proportion of childhood morbidity and mortality worldwide. In some children acute cough progresses to chronic cough (> 4 weeks duration), increasing morbidity and decreasing quality of life. Despite the importance of chronic cough as a cause of substantial childhood morbidity and associated economic, family and social costs, data on the prevalence, predictors, aetiology and natural history of the symptom are scarce. This study aims to comprehensively describe the epidemiology, aetiology and outcomes of cough during and after acute respiratory illness in children presenting to a tertiary paediatric emergency department. Methods/design: A prospective cohort study of children aged < 15 years attending the Royal Children's Hospital Emergency Department, Brisbane, for a respiratory illness that includes parent-reported cough (wet or dry) as a symptom. The primary objective is to determine the prevalence and predictors of chronic cough (≥ 4 weeks duration) following presentation with acute respiratory illness. Demographic, epidemiological, risk factor, microbiological and clinical data are collected at enrolment. Subjects complete daily cough diaries and weekly follow-up contacts for 28 (± 3) days to ascertain cough persistence. Children who continue to cough for 28 days post enrolment are referred to a paediatric respiratory physician for review. The primary analysis will be the proportion of children with persistent cough at day 28 (± 3). Multivariate analyses will be performed to evaluate variables independently associated with chronic cough at day 28 (± 3). Discussion: Ours will be the first study to comprehensively describe the natural history, epidemiology, aetiology and outcomes of cough during and after acute respiratory illness in children. The results will contribute to studies leading to the development of evidence-based clinical guidelines to improve the early detection and management of chronic cough in children during and after acute respiratory illness.
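The planned multivariate analysis of predictors of chronic cough at day 28 could, for example, take the form of a logistic regression. The sketch below uses statsmodels on simulated data; the predictor names (age_years, wet_cough, daycare) are hypothetical and not taken from the protocol.

```python
# Sketch of a multivariable logistic regression for persistent cough at day 28 (+/- 3).
# Data are simulated and predictor names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "age_years": rng.uniform(0, 15, n),     # child's age at presentation
    "wet_cough": rng.integers(0, 2, n),     # parent-reported wet (1) vs dry (0) cough
    "daycare": rng.integers(0, 2, n),       # attends childcare
})
# Simulate the day-28 outcome with an arbitrary association structure.
logit_p = -1.5 + 0.6 * df["wet_cough"] - 0.05 * df["age_years"] + 0.3 * df["daycare"]
df["persistent_day28"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("persistent_day28 ~ age_years + wet_cough + daycare", data=df).fit()
print(model.summary())
print(np.exp(model.params))   # odds ratios for each predictor
```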

Relevance: 20.00%

Abstract:

Background: Australian Indigenous children are the only population worldwide to receive the 7-valent pneumococcal conjugate vaccine (7vPCV) at 2, 4, and 6 months of age and the 23-valent pneumococcal polysaccharide vaccine (23vPPV) at 18 months of age. We evaluated this program's effectiveness in reducing the risk of hospitalization for acute lower respiratory tract infection (ALRI) in Northern Territory (NT) Indigenous children aged 5-23 months. Methods: We conducted a retrospective cohort study involving all NT Indigenous children born from 1 April 2000 through 31 October 2004. Person-time at risk after 0, 1, 2, and 3 doses of 7vPCV and after 0 and 1 dose of 23vPPV, together with the number of ALRI episodes following each dose, was used to calculate dose-specific rates of ALRI for children 5-23 months of age. Rates were compared using Cox proportional hazards models, with the number of doses of each vaccine treated as time-dependent covariates. Results: There were 5482 children and 8315 child-years at risk, with 2174 episodes of ALRI requiring hospitalization (overall incidence, 261 episodes per 1000 child-years at risk). Elevated risk of ALRI requiring hospitalization was observed after each dose of the 7vPCV, compared with that for children who received no doses, and an even greater elevation in risk was observed after each dose of the 23vPPV (adjusted hazard ratio [HR] vs. no dose, 1.39; 95% confidence interval [CI], 1.12-1.71; P = .002). Risk was highest among children vaccinated with the 23vPPV who had received < 3 doses of the 7vPCV (adjusted HR, 1.81; 95% CI, 1.32-2.48). Conclusions: Our results suggest an increased risk of ALRI requiring hospitalization after pneumococcal vaccination, particularly after receipt of the 23vPPV booster. The use of the 23vPPV booster should be reevaluated.
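A Cox model with vaccine doses entered as time-dependent covariates requires the data in a long (start/stop interval) format. The sketch below, using lifelines' CoxTimeVaryingFitter on simulated records, shows that layout; the variable names and values are illustrative and not the study's individual-level data.

```python
# Sketch of a Cox proportional hazards model with time-dependent vaccine-dose covariates,
# fitted with lifelines on simulated long-format (start/stop) records.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(2)
rows = []
for cid in range(300):
    pcv7_doses = int(rng.integers(0, 4))     # 7vPCV doses received before follow-up starts
    gets_ppv23 = int(rng.integers(0, 2))     # whether the 23vPPV booster is given at 18 months
    end = float(rng.uniform(6, 23))          # age (months) at ALRI admission or censoring
    event = int(rng.random() < 0.4)          # simulated ALRI hospitalization indicator
    if end <= 18:                            # follow-up ends before the booster is due
        rows.append((cid, 5, end, pcv7_doses, 0, event))
    else:                                    # split follow-up at 18 months, when 23vPPV status changes
        rows.append((cid, 5, 18, pcv7_doses, 0, 0))
        rows.append((cid, 18, end, pcv7_doses, gets_ppv23, event))
df = pd.DataFrame(rows, columns=["id", "start", "stop",
                                 "pcv7_doses", "ppv23_dose", "alri"])

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="alri", start_col="start", stop_col="stop")
ctv.print_summary()   # hazard ratios per additional dose of each vaccine
```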

Relevance: 20.00%

Abstract:

OBJECTIVE: We present and analyze long-term outcomes following multimodal therapy for esophageal cancer, in particular the relative impact of histomorphologic tumor regression and nodal status. PATIENTS AND METHODS: A total of 243 patients (adenocarcinoma, n = 170; squamous cell carcinoma, n = 73) treated with neoadjuvant chemoradiotherapy in the period 1990 to 2004 were followed prospectively, with a median follow-up of 60 months. Pathologic stage and tumor regression grade (TRG) were documented, the site of first failure was recorded, and Kaplan-Meier survival curves were plotted. RESULTS: Thirty patients (12%) did not undergo surgery because of disease progression or deteriorated performance status. Forty-one patients (19%) had a complete pathologic response (pCR), and there were 31 (15%) stage I, 69 (32%) stage II, and 72 (34%) stage III cases. The overall median survival was 18 months, and the 5-year survival was 27%. The 5-year survival of patients achieving a pCR was 50%, compared with 37% in non-pCR patients who were node-negative (P = 0.86). Histomorphologic tumor regression was not associated with pre-CRT cTN stage but was significantly (P < 0.05) associated with ypN stage. By multivariate analysis, ypN status (P = 0.002) was more predictive of overall survival than TRG (P = 0.06) or ypT stage (P = 0.39). CONCLUSION: Achieving node-negative status is the major determinant of outcome following neoadjuvant chemoradiotherapy. Histomorphologic tumor regression is less predictive of outcome than pathologic nodal status (ypN), and the need to include a primary-site regression score in a new staging classification is unclear. © 2007 Lippincott Williams & Wilkins, Inc.
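The Kaplan-Meier comparison reported above (pCR vs. node-negative non-pCR patients) can be sketched with the lifelines library. The survival times below are simulated purely to illustrate the plotting and log-rank comparison; they are not the study's data.

```python
# Sketch: Kaplan-Meier survival curves and a log-rank test for pCR vs node-negative non-pCR.
# Survival times are simulated and illustrative only.
import numpy as np
import matplotlib.pyplot as plt
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(3)
raw_pcr = rng.exponential(scale=70, size=41)   # months, pCR group (n = 41)
raw_non = rng.exponential(scale=50, size=80)   # months, node-negative non-pCR group
t_pcr, e_pcr = np.minimum(raw_pcr, 60), raw_pcr < 60   # administratively censor at 60 months
t_non, e_non = np.minimum(raw_non, 60), raw_non < 60

ax = plt.subplot(111)
KaplanMeierFitter().fit(t_pcr, e_pcr, label="pCR").plot_survival_function(ax=ax)
KaplanMeierFitter().fit(t_non, e_non, label="node-negative, non-pCR").plot_survival_function(ax=ax)
ax.set_xlabel("Months from treatment")
ax.set_ylabel("Survival probability")

result = logrank_test(t_pcr, t_non, event_observed_A=e_pcr, event_observed_B=e_non)
print(f"log-rank p = {result.p_value:.2f}")
plt.show()
```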

Relevance: 20.00%

Abstract:

Background. This study evaluated the time course of recovery of transverse strain in the Achilles and patellar tendons following a bout of resistance exercise. Methods. Seventeen healthy adults underwent sonographic examination of the right patellar (n = 9) or Achilles (n = 8) tendon immediately prior to and following 90 repetitions of weight-bearing exercise. Quadriceps and gastrocnemius exercise were performed against an effective resistance of 175% and 250% body weight, respectively. Sagittal tendon thickness was determined 20 mm from the tendon enthesis, and transverse strain was repeatedly monitored over a 24-hour recovery period. Results. Resistance exercise resulted in an immediate decrease in Achilles (t(7) = 10.6, P < .01) and patellar (t(8) = 8.9, P < .01) tendon thickness, corresponding to an average transverse strain of 0.14 ± 0.04 and 0.18 ± 0.05, respectively. While the average strain was not significantly different between tendons, older age was associated with a reduced transverse strain response (r = 0.63, P < .01). Recovery of transverse strain, in contrast, was prolonged compared with the duration of loading and exponential in nature. The mean primary recovery time was not significantly different between the Achilles (6.5 ± 3.2 hours) and patellar (7.1 ± 3.2 hours) tendons, and body weight accounted for 62% and 64% of the variation in recovery time, respectively. Discussion. Despite structural and biochemical differences between the Achilles and patellar tendons [1], the mechanisms underlying transverse creep-recovery in vivo appear similar and are highly time dependent. Primary recovery required about 7 hours in healthy tendons, with full recovery requiring up to 24 hours. These in vivo recovery times are similar to those reported for axial creep recovery of the vertebral disc in vitro [2], and may be used clinically to guide physical activity-to-rest ratios in healthy adults. Optimal ratios for high-stress tendons in clinical populations, however, remain unknown and require further attention in light of the knowledge gained in this study.
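Transverse strain here is the relative decrease in sagittal tendon thickness, and its recovery is described as exponential. A minimal SciPy sketch of that calculation and curve fit is shown below; the thickness values and follow-up times are hypothetical.

```python
# Sketch: transverse strain from pre/post tendon thickness, plus an exponential recovery fit.
# All measurement values are hypothetical, not the study's data.
import numpy as np
from scipy.optimize import curve_fit

thickness_pre = 5.0                                            # mm, at rest (hypothetical)
thickness_post = 4.2                                           # mm, immediately post-exercise
strain_0 = (thickness_pre - thickness_post) / thickness_pre    # transverse strain ~ 0.16

# Hypothetical strain measurements during 24 h of recovery (hours after exercise)
t_hours = np.array([0, 1, 2, 4, 6, 8, 12, 24], dtype=float)
strain = strain_0 * np.exp(-t_hours / 7.0) \
         + np.random.default_rng(4).normal(0, 0.005, t_hours.size)

def exp_recovery(t, s0, tau):
    """Exponential return of transverse strain toward the resting state."""
    return s0 * np.exp(-t / tau)

(s0_hat, tau_hat), _ = curve_fit(exp_recovery, t_hours, strain, p0=(strain_0, 5.0))
print(f"initial strain ~ {s0_hat:.3f}; primary recovery time constant ~ {tau_hat:.1f} h")
```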

Relevance: 20.00%

Abstract:

Introduction: Ondansetron is a 5-HT3 receptor antagonist commonly used as an anti-emetic to prevent the nausea and vomiting associated with anti-cancer drugs, cancer radiotherapy, or surgery. Recently, the US Food and Drug Administration (FDA) issued a warning for ondansetron because of a potential for prolongation of the QT interval of the electrocardiogram (ECG), a phenomenon associated with an increased risk of the potentially fatal arrhythmia torsade de pointes. Areas covered: We undertook a review of the cardiac safety of ondansetron. Our primary sources of information were PubMed (with downloading of full articles) and the internet. Expert opinion: The dose of ondansetron about which the FDA has concerns is 32 mg IV (or several doses that are equivalent to this), which is used only in preventing the nausea and vomiting associated with cancer chemotherapy. This suggests that ondansetron may be safe at the lower doses used to prevent nausea and vomiting after radiation treatment or surgery. However, as there is a report that a lower dose of ondansetron prolonged the QT interval in healthy volunteers, this needs to be clarified by the FDA. More research into the relationship between QT prolongation and torsade de pointes is needed so that the FDA can present clear-cut evidence of pro-arrhythmic risk when introducing such warnings.

Relevance: 20.00%

Abstract:

Low circulating folate concentrations lead to elevations of plasma homocysteine. Even mild elevations of plasma homocysteine are associated with significantly increased risk of cardiovascular disease (CVD). Available evidence suggests that poor nutrition contributes to excessive premature CVD mortality in Australian Aboriginal people. The aim of the present study was to examine the effect of a nutrition intervention program conducted in an Aboriginal community on plasma homocysteine concentrations in a community-based cohort. From 1989, a health and nutrition project was developed, implemented and evaluated with the people of a remote Aboriginal community. Plasma homocysteine concentrations were measured in a community-based cohort of 14 men and 21 women screened at baseline, 6 months and 12 months. From baseline to 6 months there was a fall in mean plasma homocysteine of over 2 μmol/L (P = 0.006), but no further change thereafter (P = 0.433). These changes were associated with a significant increase in red cell folate concentration from baseline to 6 months (P < 0.001) and a further increase from 6 to 12 months (P < 0.001). In multiple regression analysis, change in homocysteine concentration from baseline to 6 months was predicted by change in red cell folate (P = 0.002) and baseline homocysteine (P < 0.001) concentrations, but not by age, gender or baseline red cell folate concentration. We conclude that modest improvements in dietary quality among populations with poor nutrition (and limited disposable income) can lead to reductions in CVD risk.
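The multiple regression reported above (change in homocysteine predicted by change in red cell folate and baseline homocysteine, with age, gender and baseline folate as candidate predictors) can be sketched with statsmodels. The data below are simulated and the variable names are hypothetical.

```python
# Sketch of the multiple regression for the 0-6 month change in plasma homocysteine.
# Simulated data; variable names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 35                                            # 14 men and 21 women
df = pd.DataFrame({
    "baseline_hcy": rng.normal(12, 3, n),         # umol/L at baseline
    "baseline_rcf": rng.normal(400, 100, n),      # nmol/L red cell folate at baseline
    "delta_rcf": rng.normal(200, 80, n),          # 0-6 month increase in red cell folate
    "age": rng.uniform(18, 65, n),
    "female": rng.integers(0, 2, n),
})
df["delta_hcy"] = (-0.005 * df["delta_rcf"] - 0.15 * df["baseline_hcy"]
                   + rng.normal(0, 1.0, n))       # simulated 0-6 month change in homocysteine

fit = smf.ols("delta_hcy ~ delta_rcf + baseline_hcy + age + female + baseline_rcf",
              data=df).fit()
print(fit.summary())
```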

Relevance: 20.00%

Abstract:

Cryotherapy is currently used in various clinical, rehabilitative, and sporting settings. However, very little is known regarding the impact of cooling on the microcirculatory response. Objectives: The present study sought to examine the influence of two commonly employed modalities of cryotherapy, whole body cryotherapy (WBC; −110°C) and cold water immersion (CWI; 8 ± 1°C), on skin microcirculation in the mid-thigh region. Methods: The skin area examined was a 3 × 3 cm region located between the most anterior aspect of the inguinal fold and the patella. Following 10 minutes of rest, 5 healthy, active males were exposed to either WBC for 3 minutes or CWI for 5 minutes, in a randomised order. Volunteers lay supine for five minutes after treatment, and the variation in red blood cell (RBC) concentration in the region of interest was monitored over 40 minutes. The microcirculatory response was assessed using a non-invasive, portable Tissue Viability imaging system. After a minimum of seven days, the protocol was repeated. Subjective assessments of the volunteers' thermal comfort and thermal sensation were also recorded. Results: RBC concentration was altered following exposure to both WBC and CWI but appeared to stabilise approximately 35 minutes after treatments. Both WBC and CWI affected thermal sensation (p < 0.05); however, no between-group differences in thermal comfort or sensation were recorded (p > 0.05). Conclusions: As both WBC and CWI altered RBC concentration, further study is necessary to examine the mechanism of this alteration during whole body cooling.

Relevance: 20.00%

Abstract:

Prolonged intermittent-sprint exercise (i.e., team sports) induces disturbances in skeletal muscle structure and function that are associated with reduced contractile function, a cascade of inflammatory responses, perceptual soreness, and a delayed return to optimal physical performance. In this context, recovery from exercise-induced fatigue is traditionally treated from a peripheral viewpoint, with the regeneration of muscle physiology and other peripheral factors the target of recovery strategies. The direction of this research narrative on post-exercise recovery differs from the increasing emphasis on the complex interaction between central and peripheral factors regulating exercise intensity during exercise performance. Given the role of the central nervous system (CNS) in motor-unit recruitment during exercise, it too may have an integral role in post-exercise recovery. Indeed, this hypothesis is indirectly supported by an apparent disconnect between the time course of changes in physiological and biochemical markers resulting from exercise and the ensuing recovery of exercise performance. Equally, improvements in perceptual recovery, even notwithstanding the physiological state of recovery, may interact with feed-forward/feedback mechanisms to influence subsequent efforts. Considering the research interest afforded to recovery methodologies designed to hasten the return of homeostasis within the muscle, the limited focus on contributors to post-exercise recovery of CNS origin is somewhat surprising. Given this context, the current review aims to outline the potential contributions of the brain to performance recovery after strenuous exercise.

Relevance: 20.00%

Abstract:

BACKGROUND Silver dressings have been widely and successfully used to prevent cutaneous wounds, including burns, chronic ulcers, dermatitis and other cutaneous conditions, from infection. However, in a few cases, skin discolouration or argyria-like appearances have been reported. This study investigated the level of silver in scar tissue post-burn injury following application of Acticoat, a silver dressing. METHODS A porcine deep dermal partial thickness burn model was used. Burn wounds were treated with this silver dressing until completion of re-epithelialization, and silver levels were measured in a total of 160 scar and normal tissue samples. RESULTS The mean level of silver in scar tissue covered with silver dressings was 136 μg/g, while the silver level in normal skin was less than 0.747 μg/g. A number of wounds had a slate-grey appearance, and dissection of the scars revealed brown-black pigment mostly in the middle and deep dermis within the scar. The level of silver and the severity of the slate-grey discolouration were correlated with the length of time of silver dressing application. CONCLUSIONS These results show that silver deposition in cutaneous scar tissue is a common phenomenon, and higher levels of silver deposits and more severe skin discolouration are correlated with longer duration of silver dressing application.
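The correlation between tissue silver level and duration of dressing application can be quantified with a standard correlation test. The sketch below uses SciPy on simulated values; it does not reproduce the study's measurements.

```python
# Sketch: correlating scar silver level with duration of silver dressing application.
# Values are simulated; they are not the study's measurements.
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(6)
days_applied = rng.uniform(7, 35, 40)                         # days of dressing application
silver_ug_g = 20 + 4 * days_applied + rng.normal(0, 15, 40)   # scar silver level (ug/g)

r, p = pearsonr(days_applied, silver_ug_g)
rho, p_rank = spearmanr(days_applied, silver_ug_g)
print(f"Pearson r = {r:.2f} (p = {p:.3g}); Spearman rho = {rho:.2f} (p = {p_rank:.3g})")
```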

Relevance: 20.00%

Abstract:

This study aimed to explore whether participants' pre-therapy coping strategies predicted the outcome of group cognitive behavioral therapy (CBT) for anxiety and depression. It was hypothesized that adaptive coping strategies, such as active planning and acceptance, would be associated with greater reductions in anxiety and depression symptoms following psychotherapy, whereas maladaptive coping strategies, such as denial and disengagement, would be associated with smaller reductions. There were 144 participants who completed group CBT for anxiety and depression. Measures of coping strategies were administered prior to therapy, whereas measures of depression and anxiety were completed both prior to and following therapy. The results showed that higher levels of denial were associated with a poorer outcome following therapy, in terms of change in anxiety but not depression. These findings suggest the usefulness of the Denial subscale of the revised Coping Orientation to Problems Experienced (COPE) inventory as a predictor of outcome in group CBT for anxiety.
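One simple way to test whether pre-therapy denial predicts change in anxiety is a partial correlation between the Denial subscale and post-therapy anxiety, controlling for pre-therapy anxiety. The sketch below uses pingouin on simulated scores with hypothetical column names.

```python
# Sketch: does pre-therapy denial predict post-therapy anxiety once baseline anxiety is
# controlled for? Simulated scores; column names are hypothetical.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(8)
n = 144
pre_anx = rng.normal(20, 5, n)                   # pre-therapy anxiety score
denial = rng.normal(8, 2, n)                     # COPE Denial subscale score
post_anx = 0.6 * pre_anx - 3 + 0.5 * denial + rng.normal(0, 3, n)
df = pd.DataFrame({"pre_anx": pre_anx, "denial": denial, "post_anx": post_anx})

# Partial correlation of denial with post-therapy anxiety, controlling for baseline anxiety.
print(pg.partial_corr(data=df, x="denial", y="post_anx", covar="pre_anx"))
```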

Relevance: 20.00%

Abstract:

Introduction: The culture in many team sports involves consumption of large amounts of alcohol after training/competition. The effect of such a practice on the recovery processes underlying protein turnover in human skeletal muscle is unknown. We determined the effect of alcohol intake on rates of myofibrillar protein synthesis (MPS) following strenuous exercise with carbohydrate (CHO) or protein ingestion. Methods: In a randomized cross-over design, 8 physically active males completed three experimental trials comprising resistance exercise (8 × 5 reps leg extension, 80% 1 repetition maximum) followed by continuous (30 min, 63% peak power output (PPO)) and high intensity interval (10 × 30 s, 110% PPO) cycling. Immediately and 4 h post-exercise, subjects consumed either 500 mL of whey protein (25 g; PRO), alcohol (1.5 g·kg⁻¹ body mass, 12 ± 2 standard drinks) co-ingested with protein (ALC-PRO), or an energy-matched quantity of carbohydrate also with alcohol (25 g maltodextrin; ALC-CHO). Subjects also consumed a CHO meal (1.5 g CHO·kg⁻¹ body mass) 2 h post-exercise. Muscle biopsies were taken at rest and at 2 and 8 h post-exercise. Results: Blood alcohol concentration was elevated above baseline with ALC-CHO and ALC-PRO throughout recovery (P < 0.05). Phosphorylation of mTOR (Ser2448) 2 h after exercise was higher with PRO compared to ALC-PRO and ALC-CHO (P < 0.05), while p70S6K phosphorylation was higher 2 h post-exercise with ALC-PRO and PRO compared to ALC-CHO (P < 0.05). Rates of MPS increased above rest for all conditions (~29-109%, P < 0.05). However, compared to PRO, there was a hierarchical reduction in MPS with ALC-PRO (24%, P < 0.05) and with ALC-CHO (37%, P < 0.05). Conclusion: We provide novel data demonstrating that alcohol consumption reduces rates of MPS following a bout of concurrent exercise, even when co-ingested with protein. We conclude that alcohol ingestion suppresses the anabolic response in skeletal muscle and may therefore impair recovery and adaptation to training and/or subsequent performance.
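Within-subject comparison of MPS rates across the three conditions in this crossover design can be sketched as a one-way repeated-measures ANOVA with pairwise follow-ups. The example below uses pingouin on simulated fractional synthetic rates; the condition means and column names are illustrative only.

```python
# Sketch: one-way repeated-measures ANOVA comparing myofibrillar protein synthesis
# (fractional synthetic rate, %/h) across PRO, ALC-PRO and ALC-CHO in a crossover design.
# Values are simulated and illustrative, not the study's measurements.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(7)
condition_means = {"PRO": 0.100, "ALC-PRO": 0.076, "ALC-CHO": 0.063}  # illustrative FSR means
rows = []
for subj in range(8):                                   # 8 participants complete all trials
    subject_offset = rng.normal(0, 0.010)               # between-subject variability
    for cond, mean_fsr in condition_means.items():
        rows.append({"id": subj, "condition": cond,
                     "mps_fsr": mean_fsr + subject_offset + rng.normal(0, 0.008)})
df = pd.DataFrame(rows)

print(pg.rm_anova(data=df, dv="mps_fsr", within="condition", subject="id"))
# Bonferroni-corrected pairwise comparisons between conditions (pingouin >= 0.5.3):
print(pg.pairwise_tests(data=df, dv="mps_fsr", within="condition",
                        subject="id", padjust="bonf")[["A", "B", "p-unc", "p-corr"]])
```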

Relevance: 20.00%

Abstract:

The following research reports the emergence of Leptospira borgpetersenii serovar Arborea as the dominant infecting serovar following the summer of disasters and the ensuing clean-up in Queensland, Australia, during 2011. Over the 12-month period (1 January to 31 December), L. borgpetersenii serovar Arborea accounted for over 49% of infections. In response to a flooding event, public health officials need to issue community-wide announcements warning the population about the dangers of leptospirosis and other waterborne diseases. Communication with physicians working in the affected community should also be increased, to provide updated information such as the clinical presentation of leptospirosis and other waterborne diseases. These recommendations will furnish public health officials with considerations for disease management when developing future disaster management programs.