320 results for Car following.
Abstract:
Background. As a society, our interaction with the environment is having a negative impact on human health. For example, an increase in car use for short trips, over walking or cycling, has contributed to an increase in obesity, diabetes and poor heart health, and also contributes to pollution, which is associated with asthma and other respiratory diseases. In order to make that interaction more positive and healthy, it is recommended that individuals adopt a range of environmentally friendly behaviours (such as walking for transport and reducing the use of plastics). Effective interventions aimed at increasing such behaviours will need to be evidence based, and there is a need for the rapid communication of information from the point of research into policy and practice. Further, a number of health disciplines, including psychology and public health, share a common mission to promote health and well-being. Therefore, the objective of this project is to take a cross-discipline and collaborative approach to reveal psychological mechanisms driving environmentally friendly behaviour. This objective is further divided into three broad aims, the first of which is to take a cross-discipline and collaborative approach to research. The second aim is to explore and identify the salient beliefs which most strongly predict environmentally friendly behaviour. The third aim is to build an augmented model to explain environmentally friendly behaviour. The thesis builds on the understanding that an interdisciplinary collaborative approach will facilitate the rapid transfer of knowledge to inform behaviour change interventions. Methods. The application of this approach involved two surveys which explored the psycho-social predictors of environmentally friendly behaviour. Following a qualitative pilot study, and in collaboration with an expert panel comprising academics, industry professionals and government representatives, a self-administered, Theory of Planned Behaviour (TPB) based mail survey was distributed to a random sample of 3000 residents of Brisbane and the Moreton Bay Region (Queensland, Australia). This survey explored specific beliefs including attitudes, norms, perceived control, intention and behaviour, as well as environmental altruism and green identity, in relation to walking for transport and switching off lights when not in use. Following analysis of the mail survey data, and based on feedback from participants and key stakeholders, an internet survey was employed (N = 451) to explore two additional behaviours: switching off appliances at the wall when not in use, and shopping with reusable bags. This work is presented as a series of interrelated publications which address each of the research aims. Presentation of Findings. Chapter five of this thesis consists of a published paper which addresses the first aim of the research and outlines the collaborative and multidisciplinary approach employed in the mail survey. The paper argued that forging alliances with those who are in a position to immediately utilise the findings of research has the potential to improve the quality and timeliness of research communication. Illustrating this timely communication, Chapter six comprises a report presented to Moreton Bay Regional Council (MBRC). This report addresses aims one and two.
The report contains a summary of participation in a range of environmentally friendly behaviours and identifies the beliefs which most strongly predicted walking for transport and switching off lights (from the mail survey). These salient beliefs were then recommended as targets for interventions and included: participants believing that they might save money; that their neighbours also switch off lights; that it would be inconvenient to walk for transport; and that their closest friend also walks for transport. Chapter seven also addresses the second aim and presents a published conference paper in which the salient beliefs predicting the four specified behaviours (from both surveys) are identified and potential applications for intervention are discussed. Again, a range of TPB based beliefs, including descriptive normative beliefs, were predictive of environmentally friendly behaviour. This paper was also provided to MBRC, along with recommendations for applying the findings. For example, as descriptive normative beliefs were consistently correlated with environmentally friendly behaviour, local councils could engage in marketing and interventions (workshops, letter box drops, internet promotions) which encourage parents and friends to model, rather than simply encourage, environmentally friendly behaviour. The final two papers, presented in Chapters eight and nine, address the third aim of the project. These papers each present two behaviours together to inform a TPB based theoretical model with which to predict environmentally friendly behaviour. A generalised model is presented, which is found to predict the four specific behaviours under investigation. The role of demographics was explored across each of the behaviour specific models. It was found that some behaviours differ by age, gender, income or education. In particular, adjusted models predicted more of the variance in walking for transport amongst younger participants and females. Adjusted models predicted more variance in switching off lights amongst those with a bachelor degree or higher, and predicted more variance in switching off appliances amongst those on a higher income. Adjusted models predicted more variance in shopping with reusable bags for males, people 40 years or older, those on a higher income and those with a bachelor degree or higher. However, model structure and general predictability were relatively consistent overall. The models provide a general theoretical framework from which to better understand the motives and predictors of environmentally friendly behaviour. Conclusion. This research has provided an example of the benefits of a collaborative interdisciplinary approach. It has identified a number of salient beliefs which can be targeted for social marketing campaigns and educational initiatives, and these findings, along with recommendations, have been passed on to a local council to be used as part of their ongoing community engagement programs. Finally, the research has informed a practical model, as well as behaviour specific models, for predicting sustainable living behaviours. Such models can highlight important core constructs from which targeted interventions can be designed. Therefore, this research represents an important step in undertaking collaborative approaches to improving population health through human-environment interactions.
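As a rough illustration of the kind of TPB-based regression modelling this abstract describes (not the thesis' actual analysis), the sketch below fits a base TPB model and an "augmented" model and compares the variance each explains. All file and column names are hypothetical placeholders.

```python
# Minimal sketch of a TPB-style analysis, assuming a long-format survey
# dataset. Column names (attitude, pbc, green_identity, ...) are
# hypothetical stand-ins for the constructs named in the abstract.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey.csv")  # hypothetical survey dataset

# Base TPB model: attitude, subjective norm, perceived behavioural
# control and intention predicting self-reported behaviour
base = smf.ols("behaviour ~ attitude + subjective_norm + pbc + intention",
               data=df).fit()

# Augmented model adding descriptive norms and green identity, mirroring
# the type of extension the thesis reports
augmented = smf.ols("behaviour ~ attitude + subjective_norm + pbc + "
                    "intention + descriptive_norm + green_identity",
                    data=df).fit()

# Compare variance explained by the two models
print(f"base R2 = {base.rsquared:.3f}, "
      f"augmented R2 = {augmented.rsquared:.3f}")
```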
Abstract:
Individual variability in the acquisition, consolidation and extinction of conditioned fear potentially contributes to the development of fear pathology, including posttraumatic stress disorder (PTSD). Pavlovian fear conditioning is a key tool for the study of fundamental aspects of fear learning. Here, we used selected mouse lines of High and Low Pavlovian conditioned fear, created from an advanced intercross line (AIL), in order to begin to identify the cellular basis of phenotypic divergence in Pavlovian fear conditioning. We investigated whether phosphorylated MAPK (p44/42 ERK/MAPK), a protein kinase required in the amygdala for the acquisition and consolidation of Pavlovian fear memory, is differentially expressed following Pavlovian fear learning in the High and Low fear lines. We found that following Pavlovian auditory fear conditioning, High and Low line mice differ in the number of pMAPK-expressing neurons in the dorsal subnucleus of the lateral amygdala (LAd). In contrast, this difference was not detected in the ventral medial (LAvm) or ventral lateral (LAvl) amygdala subnuclei, or in control animals. We propose that this apparent increase in plasticity at a known locus of fear memory acquisition and consolidation relates to intrinsic differences between the two fear phenotypes. These data provide important insights into the micronetwork mechanisms encoding phenotypic differences in fear. Understanding the circuit-level cellular and molecular mechanisms that underlie individual variability in fear learning is critical for the development of effective treatments for fear-related illnesses such as PTSD.
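The core comparison here is a per-subnucleus cell count contrast between lines. A minimal sketch of that style of test, with entirely invented placeholder counts (the paper's actual counts and statistics are not reproduced here):

```python
# Hypothetical example: comparing pMAPK-positive neuron counts per
# animal between High and Low fear lines in one subnucleus (e.g. LAd)
# with an independent-samples t-test. All values are placeholders.
from scipy import stats

high_line_counts = [48, 52, 61, 55, 59]  # hypothetical cells/section
low_line_counts = [31, 36, 29, 40, 33]   # hypothetical cells/section

t, p = stats.ttest_ind(high_line_counts, low_line_counts)
print(f"t = {t:.2f}, p = {p:.4f}")
```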
Abstract:
INTRODUCTION There is evidence that the reduction of blood perfusion caused by closed soft tissue trauma (CSTT) delays the healing of the affected soft tissues and bone [1]. We hypothesise that characterising vascular morphology changes (VMC) following injury allows us to determine the effect of the injury on tissue perfusion, and thereby the severity of the injury. This research therefore aims to assess the VMC following CSTT in a rat model using contrast-enhanced micro-CT imaging. METHODOLOGY A reproducible CSTT was created on the left leg of anaesthetised rats (male, 12 weeks) with an impact device. After the animals were euthanised at 6 and 24 hours following trauma, the vasculature was perfused with a contrast agent (Microfil, Flowtech, USA). Both hindlimbs were dissected and imaged using micro-CT for qualitative comparison of the vascular morphology and quantification of the total vascular volume (VV). In addition, biopsy samples were taken from the CSTT region and scanned to compare morphological parameters of the vasculature between the injured and control limbs. RESULTS AND DISCUSSION While visual observation of the hindlimb scans showed consistent perfusion of the microvasculature with Microfil, enabling the identification of all major blood vessels, no clear differences in the vascular architecture were observed between injured and control limbs. However, overall VV within the region of interest (ROI) was measured to be higher for the injured limbs after 24 h. Also, scans of biopsy samples demonstrated that vessel diameter and density were higher in the injured legs 24 h after impact. CONCLUSION We believe these results will contribute to the development of objective diagnostic methods for CSTT based on changes to the microvascular morphology, as well as aiding in the validation of future non-invasive clinical assessment modalities.
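The VV quantification step amounts to segmenting the contrast-filled vessels in the reconstructed volume and summing voxel volumes. A minimal sketch of that idea, assuming the micro-CT stack is available as a 3D NumPy array (the threshold and voxel size below are hypothetical, not the study's values):

```python
# Sketch of total vascular volume (VV) from a micro-CT stack: threshold
# the radio-opaque contrast agent and count vessel voxels in the ROI.
import numpy as np

def vascular_volume(stack: np.ndarray, threshold: float,
                    voxel_volume_mm3: float) -> float:
    """Return total vessel volume (mm^3) from a thresholded micro-CT stack."""
    vessel_mask = stack > threshold  # contrast-filled vessels are brightest
    return float(vessel_mask.sum()) * voxel_volume_mm3

# Usage with a synthetic stack standing in for real scan data
stack = np.random.rand(100, 100, 100)  # placeholder volume
vv = vascular_volume(stack, threshold=0.95, voxel_volume_mm3=1e-6)
print(f"VV = {vv:.4f} mm^3")
```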
Abstract:
Diagnosis threat is a psychosocial factor that has been proposed to contribute to poor outcomes following mild traumatic brain injury (mTBI). This threat is thought to impair the cognitive test performance of individuals with mTBI because of negative injury stereotypes. University students (N = 45, 62.2% female) with a history of mTBI were randomly allocated to a diagnosis threat (DT, n = 15), reduced threat (DT-reduced, n = 15) or neutral (n = 15) group. The reduced threat condition invoked a positive stereotype (i.e., that people with mTBI can perform well on cognitive tests). All participants were given neutral instructions before they completed baseline tests of: a) objective cognitive function across a number of domains; b) psychological symptoms; and c) postconcussion syndrome (PCS) symptoms, including self-reported cognitive and emotional difficulties. Participants then received either neutral, DT or DT-reduced instructions before repeating the tests. Results were analyzed using separate mixed-model ANOVAs, one for each dependent measure. The only significant result was for the 2 × 3 ANOVA on an objective test of attention/working memory, Digit Span, p < .05, such that the DT-reduced group performed better than the other groups, which did not differ from each other. Although not consistent with predictions or earlier DT studies, the absence of group differences on most tests fits with several recent DT findings. The results of this study suggest that it is timely to reconsider the role of DT as a unique contributor to poor mTBI outcome.
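For readers unfamiliar with the design, the analysis reported here is a mixed-design ANOVA: group (DT / DT-reduced / neutral) between subjects, session (baseline / post-instruction) within subjects. A hedged sketch of one way to run such a test, with hypothetical file and column names:

```python
# Mixed-design ANOVA sketch using pingouin, assuming long-format data
# with one row per participant per session. Columns (id, group,
# session, score) are hypothetical placeholders.
import pandas as pd
import pingouin as pg

df = pd.read_csv("digit_span_long.csv")  # hypothetical dataset

aov = pg.mixed_anova(data=df, dv="score", within="session",
                     subject="id", between="group")
print(aov)  # the group x session interaction row is the key result
```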
Abstract:
Background Acute respiratory illness, a leading cause of cough in children, accounts for a substantial proportion of childhood morbidity and mortality worldwide. In some children acute cough progresses to chronic cough (> 4 weeks' duration), impacting on morbidity and decreasing quality of life. Despite the importance of chronic cough as a cause of substantial childhood morbidity and associated economic, family and social costs, data on the prevalence, predictors, aetiology and natural history of the symptom are scarce. This study aims to comprehensively describe the epidemiology, aetiology and outcomes of cough during and after acute respiratory illness in children presenting to a tertiary paediatric emergency department. Methods/design A prospective cohort study of children aged < 15 years attending the Royal Children's Hospital Emergency Department, Brisbane, for a respiratory illness that includes parent-reported cough (wet or dry) as a symptom. The primary objective is to determine the prevalence and predictors of chronic cough (≥ 4 weeks' duration) post presentation with acute respiratory illness. Demographic, epidemiological, risk factor, microbiological and clinical data are collected at enrolment. Subjects complete daily cough diaries and weekly follow-up contacts for 28 (±3) days to ascertain cough persistence. Children who continue to cough for 28 days post enrolment are referred to a paediatric respiratory physician for review. The primary analysis will be the proportion of children with persistent cough at day 28 (±3). Multivariate analyses will be performed to evaluate variables independently associated with chronic cough at day 28 (±3). Discussion Our protocol will be the first to comprehensively describe the natural history, epidemiology, aetiology and outcomes of cough during and after acute respiratory illness in children. The results will contribute to studies leading to the development of evidence-based clinical guidelines to improve the early detection and management of chronic cough in children during and after acute respiratory illness.
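The planned multivariate analysis of a binary day-28 outcome is typically a logistic regression. An illustrative sketch of that approach, not the protocol's actual analysis plan, with hypothetical predictor names standing in for the study's demographic and clinical variables:

```python
# Logistic regression sketch: persistent cough at day 28 as the outcome,
# with hypothetical candidate predictors. File and column names are
# placeholders, not the study's real dataset.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cough_cohort.csv")  # hypothetical enrolment dataset

model = smf.logit(
    "chronic_cough_day28 ~ age + wet_cough + daycare_attendance + "
    "prior_respiratory_illness",
    data=df,
).fit()
print(model.summary())  # exponentiate coefficients for odds ratios
```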
Abstract:
Background Australian Indigenous children are the only population worldwide to receive the 7-valent pneumococcal conjugate vaccine (7vPCV) at 2, 4, and 6 months of age and the 23-valent pneumococcal polysaccharide vaccine (23vPPV) at 18 months of age. We evaluated this program's effectiveness in reducing the risk of hospitalization for acute lower respiratory tract infection (ALRI) in Northern Territory (NT) Indigenous children aged 5-23 months. Methods We conducted a retrospective cohort study involving all NT Indigenous children born from 1 April 2000 through 31 October 2004. Person-time at risk after 0, 1, 2, and 3 doses of 7vPCV and after 0 and 1 dose of 23vPPV, and the number of ALRI episodes following each dose, were used to calculate dose-specific rates of ALRI for children 5-23 months of age. Rates were compared using Cox proportional hazards models, with the number of doses of each vaccine serving as time-dependent covariates. Results There were 5482 children and 8315 child-years at risk, with 2174 episodes of ALRI requiring hospitalization (overall incidence, 261 episodes per 1000 child-years at risk). Elevated risk of ALRI requiring hospitalization was observed after each dose of the 7vPCV vaccine, compared with that for children who received no doses, and an even greater elevation in risk was observed after each dose of the 23vPPV (adjusted hazard ratio [HR] vs no dose, 1.39; 95% confidence interval [CI], 1.12-1.71; P = .002). Risk was highest among children vaccinated with the 23vPPV who had received < 3 doses of the 7vPCV (adjusted HR, 1.81; 95% CI, 1.32-2.48). Conclusions Our results suggest an increased risk of ALRI requiring hospitalization after pneumococcal vaccination, particularly after receipt of the 23vPPV booster. The use of the 23vPPV booster should be reevaluated.
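A Cox model with dose count as a time-dependent covariate is usually fitted on a long-format dataset in which each row is an interval of follow-up at a constant dose count. A minimal sketch of that setup, not the paper's code, with hypothetical column names:

```python
# Cox model with time-varying dose counts using lifelines'
# CoxTimeVaryingFitter. The CSV and its columns are hypothetical
# placeholders for the study's interval-split cohort data.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

df = pd.read_csv("alri_intervals.csv")
# expected columns: child_id, start_age, stop_age,
#                   pcv7_doses, ppv23_doses, alri_event

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="child_id", start_col="start_age",
        stop_col="stop_age", event_col="alri_event")
ctv.print_summary()  # hazard ratios per additional dose of each vaccine
```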
Abstract:
OBJECTIVE: We present and analyze long-term outcomes following multimodal therapy for esophageal cancer, in particular the relative impact of histomorphologic tumor regression and nodal status. PATIENTS AND METHODS: A total of 243 patients (adenocarcinoma, n = 170; squamous cell carcinoma, n = 73) treated with neoadjuvant chemoradiotherapy in the period 1990 to 2004 were followed prospectively, with a median follow-up of 60 months. Pathologic stage and tumor regression grade (TRG) were documented, the site of first failure was recorded, and Kaplan-Meier survival curves were plotted. RESULTS: Thirty patients (12%) did not undergo surgery due to disease progression or deteriorated performance status. Forty-one patients (19%) had a complete pathologic response (pCR), and there were 31 (15%) stage I, 69 (32%) stage II, and 72 (34%) stage III cases. The overall median survival was 18 months, and the 5-year survival was 27%. The 5-year survival of patients achieving a pCR was 50%, compared with 37% in non-pCR patients who were node-negative (P = 0.86). Histomorphologic tumor regression was not associated with pre-CRT cTN stage but was significantly (P < 0.05) associated with ypN stage. By multivariate analysis, ypN status (P = 0.002) was more predictive of overall survival than TRG (P = 0.06) or ypT stage (P = 0.39). CONCLUSION: Achieving a node-negative status is the major determinant of outcome following neoadjuvant chemoradiotherapy. Histomorphologic tumor regression is less predictive of outcome than pathologic nodal status (ypN), and the need to include a primary-site regression score in a new staging classification is unclear.
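The Kaplan-Meier comparison by nodal status described above can be sketched as follows; this is an illustrative reconstruction with hypothetical column names, not the authors' analysis code:

```python
# Kaplan-Meier survival curves stratified by post-treatment nodal
# status (ypN). Dataset and columns are hypothetical placeholders.
import pandas as pd
import matplotlib.pyplot as plt
from lifelines import KaplanMeierFitter

df = pd.read_csv("esophageal_cohort.csv")
# expected columns: months_followup, died, ypn_positive

kmf = KaplanMeierFitter()
ax = plt.subplot(111)
for label, grp in df.groupby("ypn_positive"):
    kmf.fit(grp["months_followup"], grp["died"], label=f"ypN+ = {label}")
    kmf.plot_survival_function(ax=ax)
plt.xlabel("Months from treatment")
plt.ylabel("Overall survival")
plt.show()
```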
Abstract:
Background. This study evaluated the time course of recovery of transverse strain in the Achilles and patellar tendons following a bout of resistance exercise. Methods. Seventeen healthy adults underwent sonographic examination of the right patellar (n = 9) or Achilles (n = 8) tendon immediately prior to and following 90 repetitions of weight-bearing exercise. Quadriceps and gastrocnemius exercise were performed against an effective resistance of 175% and 250% body weight, respectively. Sagittal tendon thickness was determined 20 mm from the tendon enthesis, and transverse strain was repeatedly monitored over a 24-hour recovery period. Results. Resistance exercise resulted in an immediate decrease in Achilles (t(7) = 10.6, P < .01) and patellar (t(8) = 8.9, P < .01) tendon thickness, resulting in an average transverse strain of 0.14 ± 0.04 and 0.18 ± 0.05, respectively. While the average strain was not significantly different between tendons, older age was associated with a reduced transverse strain response (r = 0.63, P < .01). Recovery of transverse strain, in contrast, was prolonged compared with the duration of loading and exponential in nature. The mean primary recovery time was not significantly different between Achilles (6.5 ± 3.2 hours) and patellar (7.1 ± 3.2 hours) tendons, and body weight accounted for 62% and 64% of the variation in recovery time, respectively. Discussion. Despite structural and biochemical differences between the Achilles and patellar tendons [1], the mechanisms underlying transverse creep recovery in vivo appear similar and are highly time dependent. Primary recovery required about 7 hours in healthy tendons, with full recovery requiring up to 24 hours. These in vivo recovery times are similar to those reported for axial creep recovery of the vertebral disc in vitro [2], and may be used clinically to guide physical activity-to-rest ratios in healthy adults. Optimal ratios for high-stress tendons in clinical populations, however, remain unknown and require further attention in light of the knowledge gained in this study.
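Since recovery is described as exponential in nature, a recovery time constant can be estimated by fitting a decaying exponential to repeated strain measurements. A sketch of that fit with invented placeholder data (not the study's measurements):

```python
# Fitting exponential recovery of transverse strain with scipy.
# The function form and all data points below are illustrative
# placeholders; tau plays the role of a recovery time constant.
import numpy as np
from scipy.optimize import curve_fit

def recovery(t, s0, tau):
    """Exponential return of transverse strain toward zero."""
    return s0 * np.exp(-t / tau)

t_hours = np.array([0, 1, 2, 4, 8, 12, 24], dtype=float)
strain = np.array([0.16, 0.12, 0.09, 0.05, 0.02, 0.01, 0.0])  # placeholder

(s0, tau), _ = curve_fit(recovery, t_hours, strain, p0=(0.15, 5.0))
print(f"initial strain = {s0:.3f}, time constant = {tau:.1f} h")
```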
Abstract:
Introduction: Ondansetron is a 5-HT3 receptor antagonist commonly used as an anti-emetic to prevent the nausea and vomiting associated with anti-cancer drugs, cancer radiotherapy, or surgery. Recently, the US Food and Drug Administration (FDA) issued a warning for ondansetron due to its potential to prolong the QT interval of the electrocardiogram (ECG), a phenomenon that is associated with an increased risk of the potentially fatal arrhythmia torsade de pointes. Areas covered: We undertook a review of the cardiac safety of ondansetron. Our primary sources of information were PubMed (with downloading of full articles) and the internet. Expert opinion: The dose of ondansetron that the FDA has concerns about is 32 mg IV (or several doses that are equivalent to this), which is only used in preventing nausea and vomiting associated with cancer chemotherapy. This suggests that ondansetron may be safe in the lower doses used to prevent nausea and vomiting in radiation treatment or postoperatively. However, as there is a report that a lower dose of ondansetron prolonged the QT interval in healthy volunteers, this needs to be clarified by the FDA. More research into the relationship between QT prolongation and torsade de pointes is also needed, so that the FDA can present clear-cut evidence of pro-arrhythmic risk when issuing such warnings.
Abstract:
Low circulating folate concentrations lead to elevations of plasma homocysteine. Even mild elevations of plasma homocysteine are associated with significantly increased risk of cardiovascular disease (CVD). Available evidence suggests that poor nutrition contributes to excessive premature CVD mortality in Australian Aboriginal people. The aim of the present study was to examine the effect of a nutrition intervention program conducted in an Aboriginal community on plasma homocysteine concentrations in a community-based cohort. From 1989, a health and nutrition project was developed, implemented and evaluated with the people of a remote Aboriginal community. Plasma homocysteine concentrations were measured in a community-based cohort of 14 men and 21 women screened at baseline, 6 months and 12 months. From baseline to 6 months there was a fall in mean plasma homocysteine of over 2 μmol/L (P = 0.006), but no further change thereafter (P = 0.433). These changes were associated with a significant increase in red cell folate concentration from baseline to 6 months (P < 0.001) and a further increase from 6 to 12 months (P < 0.001). In multiple regression analysis, change in homocysteine concentration from baseline to 6 months was predicted by change in red cell folate (P = 0.002) and baseline homocysteine (P < 0.001) concentrations, but not by age, gender or baseline red cell folate concentration. We conclude that modest improvements in dietary quality among populations with poor nutrition (and limited disposable income) can lead to reductions in CVD risk.
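An illustrative version of the multiple regression reported above, with hypothetical file and column names (not the study's data or code):

```python
# Change in plasma homocysteine (baseline to 6 months) regressed on
# change in red cell folate and baseline homocysteine, adjusting for
# age and gender. All names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("community_cohort.csv")  # hypothetical cohort data
df["hcy_change"] = df["hcy_6m"] - df["hcy_baseline"]
df["folate_change"] = df["rcf_6m"] - df["rcf_baseline"]

model = smf.ols(
    "hcy_change ~ folate_change + hcy_baseline + age + gender",
    data=df,
).fit()
print(model.summary())
```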
Abstract:
Cryotherapy is currently used in various clinical, rehabilitative, and sporting settings. However, very little is known regarding the impact of cooling on the microcirculatory response. Objectives: The present study sought to examine the influence of two commonly employed modalities of cryotherapy, whole body cryotherapy (WBC; -110°C) and cold water immersion (CWI; 8 ± 1°C), on skin microcirculation in the mid-thigh region. Methods: The skin area examined was a 3 × 3 cm region located between the most anterior aspect of the inguinal fold and the patella. Following 10 minutes of rest, 5 healthy, active males were exposed to either WBC for 3 minutes or CWI for 5 minutes, in a randomised order. Volunteers lay supine for five minutes after treatment, and the variation in red blood cell (RBC) concentration in the region of interest was monitored for a total duration of 40 minutes. The microcirculatory response was assessed using a non-invasive, portable Tissue Viability imaging system. After a minimum of seven days, the protocol was repeated. Subjective assessments of the volunteers' thermal comfort and thermal sensation were also recorded. Results: RBC concentration was altered following exposure to both WBC and CWI but appeared to stabilise approximately 35 minutes after treatment. Both WBC and CWI affected thermal sensation (p < 0.05); however, no between-group differences in thermal comfort or sensation were recorded (p > 0.05). Conclusions: As both WBC and CWI altered RBC concentration, further study is necessary to examine the mechanism for this alteration during whole body cooling.
Abstract:
Introduction Sleep restriction and missing one night's continuous positive airway pressure (CPAP) treatment are scenarios faced by obstructive sleep apnoea (OSA) patients, who must then assess their own fitness to drive. This study aims to assess the impact of these scenarios on driving performance. Method Eleven CPAP-treated participants (50-75 years) drove an interactive car simulator under monotonous motorway conditions for 2 hours on three afternoons, following: (i) a normal night's sleep (average 8.2 h) with CPAP; (ii) sleep restriction (5 h) with CPAP; and (iii) a normal length of sleep without CPAP. Driving incidents were noted whenever the car drifted out of the designated driving lane. EEG was recorded continuously, and Karolinska Sleepiness Scale (KSS) ratings were reported every 200 seconds. Results Driving incidents: Incidents were more prevalent following CPAP withdrawal during hour 1, demonstrating a significant condition × time interaction [F(6,60) = 3.40, p = 0.006]. KSS: At the start of driving, participants felt sleepiest following CPAP withdrawal; by the end of the task, KSS levels were similar following CPAP withdrawal and sleep restriction, demonstrating a significant condition × time interaction [F(3.94,39.41) = 3.39, p = 0.018]. EEG: There was a non-significant trend for combined alpha and theta activity to be highest throughout the drive following CPAP withdrawal. Discussion CPAP withdrawal impairs driving simulator performance sooner than restricting sleep to 5 h with CPAP. Participants had insight into this increased sleepiness, reflected by the higher KSS ratings reported following CPAP withdrawal. In the practical terms of driving, any one incident could be fatal. The earlier impairment reported here demonstrates the potential danger of missing CPAP treatment and highlights the benefit of CPAP treatment even when sleep time is short.
Abstract:
Objectives The UK Department for Transport recommends taking a break from driving every 2 h. This study investigated: (i) whether a 2 h drive time on a monotonous road is appropriate for OSA patients treated with CPAP, compared with healthy age-matched controls; (ii) the impact of a night's sleep restriction (with CPAP); and (iii) what happens if these patients miss one night's CPAP treatment. Methods Nineteen healthy men aged 52-74 y (m = 66.2 y) and 19 OSA participants aged 50-75 y (m = 64.4 y) drove an interactive car simulator under monotonous motorway conditions for 2 h on two afternoons, in a counterbalanced design: (1) following a normal night's sleep (8 h); (2) following a restricted night's sleep (5 h), with normal CPAP use; and (3) following a night without CPAP treatment (n = 11). Lane drifting incidents, indicative of falling asleep, were recorded for up to 2 h depending on competence to continue driving. Results Normal sleep: Controls drove for an average of 95.9 min (s.d. 37 min) and treated OSA drivers for 89.6 min (s.d. 29 min) without incident; 63.2% of controls and 42.1% of OSA drivers successfully completed the drive without an incident. Sleep restriction: 47.4% of controls and 26.3% of OSA drivers finished without incident. Overall, controls drove for an average of 89.5 min (s.d. 39 min) and treated OSA drivers 65 min (s.d. 42 min) without incident. The effect of condition was significant [F(1,36) = 9.237, P < 0.05, η² = 0.204]. Stopping CPAP: 18.2% of drivers successfully completed the drive. Overall, participants drove for an average of 50.1 min (s.d. 38 min) without incident. The effect of condition was significant [F(2) = 8.8, P < 0.05, η² = 0.468]. Conclusion 52.6% of all drivers were able to complete a 2 hour drive under monotonous conditions after a full night's sleep. Sleep restriction significantly affected both control and OSA drivers. We find evidence that treated OSA drivers are more impaired by sleep restriction than healthy controls, as they were less able to safely sustain the 2 h drive without incident. OSA drivers should be aware that non-compliance with CPAP can significantly impair driving performance. It may be appropriate to recommend that older drivers take a break from driving every 90 min, especially when undertaking a monotonous drive, as was the case here.
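One natural way to frame these outcomes is as time-to-event data: minutes driven before the first lane-drift incident, with drivers who complete the full 2 h treated as censored. A hedged sketch of a group comparison in that framing (not the study's analysis; all durations below are invented placeholders):

```python
# Log-rank comparison of time to first lane-drift incident between
# control and treated-OSA drivers. Completing the 2 h drive without
# incident is censored (event = 0). All values are placeholders.
import numpy as np
from lifelines.statistics import logrank_test

controls_min = np.array([120, 120, 95, 120, 70, 110])  # placeholder
controls_event = np.array([0, 0, 1, 0, 1, 1])          # 1 = incident

osa_min = np.array([120, 60, 45, 120, 80, 30])         # placeholder
osa_event = np.array([0, 1, 1, 0, 1, 1])

res = logrank_test(controls_min, osa_min,
                   event_observed_A=controls_event,
                   event_observed_B=osa_event)
print(f"log-rank p = {res.p_value:.3f}")
```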