Abstract:
Driving and using prescription medicines that have the potential to impair driving is an emerging research area. To date it is characterised by a limited (although growing) number of studies and methodological complexities that make generalisations about impairment due to medications difficult. Consistent evidence has been found for the impairing effects of hypnotics, sedative antidepressants and antihistamines, and narcotic analgesics, although it has been estimated that as many as nine medication classes have the potential to impair driving (Alvarez & del Rio, 2000; Walsh, de Gier, Christopherson, & Verstraete, 2004). There is also evidence for increased negative effects related to concomitant use of other medications and alcohol (Movig et al., 2004; Pringle, Ahern, Heller, Gold, & Brown, 2005). Statistics on the high levels of Australian prescription medication use suggest that consumer awareness of driving impairment due to medicines should be examined. One web-based study has found a low level of awareness, knowledge and risk perceptions among Australian drivers about the impairing effects of various medications on driving (Mallick, Johnston, Goren, & Kennedy, 2007). The lack of awareness and knowledge brings into question the effectiveness of the existing countermeasures. In Australia these consist of the use of ancillary warning labels administered under mandatory regulation and professional guidelines, advice to patients, and the use of Consumer Medicines Information (CMI) with medications that are known to cause impairment. The responsibility for the use of the warnings and related counsel to patients primarily lies with the pharmacist when dispensing relevant medication. A review by the Therapeutic Goods Administration (TGA) noted that in practice, advice to patients may not occur and that CMI is not always available (TGA, 2002). 
Researchers have also found that patients' recall of verbal counsel is very low (Houts, Bachrach, Witmer, Tringali, Bucher, & Localio, 1998). With healthcare increasingly provided in outpatient settings (Davis et al., 2006; Vingilis & MacDonald, 2000), establishing the effectiveness of the warning labels as a countermeasure is especially important. There have been recent international developments in medication categorisation systems and associated medication warning labels. In 2005, France implemented a four-tier medication categorisation and warning system to improve patients' and health professionals' awareness and knowledge of related road safety issues (AFSSAPS, 2005). This warning system uses a pictogram and indicates the level of potential impairment of driving performance through the use of colour, together with advice on the recommended behaviour to adopt towards driving. The comparable Australian system does not indicate the severity of potential effects, and does not provide specific guidelines on the attitude or actions that the individual should adopt towards driving. It relies upon the patient to be vigilant in self-monitoring effects, to understand the potential ways in which they may be affected and how serious these effects may be, and to adopt the appropriate protective actions. This thesis investigates the responses of a sample of Australian hospital outpatients who receive appropriate labelling and counselling advice about potential driving impairment due to prescribed medicines. It aims to provide baseline data on the understanding and use of relevant medications by a Queensland public hospital outpatient sample recruited through the hospital pharmacy.
It includes an exploration and comparison of the effect of the Australian and French medication warning systems on medication user knowledge, attitudes, beliefs and behaviour, and explores whether there are areas in which the Australian system may be improved by including any beneficial elements of the French system. A total of 358 outpatients were surveyed, and a follow-up telephone survey was conducted with a subgroup of consenting participants who were taking at least one medication that required an ancillary warning label about driving impairment. A complementary study of 75 French hospital outpatients was also conducted to further investigate the performance of the warnings. Not surprisingly, medication use among the Australian outpatient sample was high. The ancillary warning labels required to appear on medications that can impair driving were prevalent. A subgroup of participants was identified as being potentially at-risk of driving impaired, based on their reported recent use of medications requiring an ancillary warning label and level of driving activity. The sample reported previous behaviour and held future intentions that were consistent with warning label advice and health protective action. Participants did not express a particular need for being advised by a health professional regarding fitness to drive in relation to their medication. However, it was also apparent from the analysis that the participants would be significantly more likely to follow advice from a doctor than a pharmacist. High levels of knowledge in terms of general principles about effects of alcohol, illicit drugs and combinations of substances, and related health and crash risks were revealed. This may reflect a sample specific effect. 
The professional guidelines for hospital pharmacists emphasise that advisory labels must be applied to medicines where applicable and that warning advice must be given to all patients on medication which may affect driving (SHPA, 2006, p. 221). The research program applied selected theoretical constructs from Schwarzer's (1992) Health Action Process Approach, which extends constructs from existing health theories such as the Theory of Planned Behavior (Ajzen, 1991) to better account for the intention-behaviour gap often observed when predicting behaviour. This was undertaken to explore the utility of the constructs in understanding and predicting intentions and behaviour related to compliance with the mandatory medication warning about driving impairment. This investigation revealed that the theoretical constructs related to intention and planning to avoid driving if an effect from the medication was noticed were useful. Not all of the theoretical model constructs that had been demonstrated to be significant predictors in previous research on different health behaviours were significant in the present analyses. Positive outcome expectancies from avoiding driving were found to be important influences on forming the intention to avoid driving if an effect due to medication was noticed. In turn, intention was found to be a significant predictor of planning. Other selected theoretical constructs failed to predict compliance with the Australian warning label advice. It is possible that the limited predictive power of a number of constructs, including risk perceptions, is due to the small sample size obtained at follow-up, on which the evaluation is based. Alternatively, it is possible that the theoretical constructs failed to sufficiently account for issues of particular relevance to the driving situation.
The responses of the Australian hospital outpatient sample towards the Australian and French medication warning labels, which differed according to visual characteristics and warning message, were examined. In addition, a complementary study with a sample of French hospital outpatients was undertaken in order to allow general comparisons concerning the performance of the warnings. While a large amount of research exists concerning warning effectiveness, there is little research that has specifically investigated medication warnings relating to driving impairment. General established principles concerning factors that have been demonstrated to enhance warning noticeability and behavioural compliance have been extrapolated and investigated in the present study. The extent to which there is a need for education and improved health messages on this issue was a core issue of investigation in this thesis. Among the Australian sample, the size of the warning label and text, and red colour were the most visually important characteristics. The pictogram used in the French labels was also rated highly, and was salient for a large proportion of the sample. According to the study of French hospital outpatients, the pictogram was perceived to be the most important visual characteristic. Overall, the findings suggest that the Australian approach of using a combination of visual characteristics was important for the majority of the sample but that the use of a pictogram could enhance effects. A high rate of warning recall was found overall and a further important finding was that higher warning label recall was associated with increased number of medication classes taken. These results suggest that increased vigilance and care are associated with the number of medications taken and the associated repetition of the warning message. 
Significantly higher levels of risk perception were found for the French Level 3 (highest severity) label compared with the comparable mandatory Australian ancillary Label 1 warning. Participants' intentions related to the warning labels indicated that they would be more cautious while taking potentially impairing medication displaying the French Level 3 label compared with the Australian Label 1. These are potentially important findings for the Australian context regarding the current driving impairment warnings displayed on medication. The findings raise other important implications for the Australian labelling context. An underlying factor may be the differences in the wording of the warning messages that appear on the Australian and French labels. The French label explicitly states "do not drive" while the Australian label states "if affected, do not drive", and the difference in responses may reflect that less severity is perceived where the situation involves the consumer's self-assessment of their impairment. The differences in the assignment of responsibility between the Australian (the consumer assesses and decides) and French (the doctor assesses and decides) approaches to the decision to drive while taking medication raise the core question of who is best able to assess driving impairment due to medication: the consumer, or the health professional? There are pros and cons related to knowledge, expertise and practicalities with either option. However, if the safety of the consumer is the primary aim, then the trend towards stronger risk perceptions and more consistent and cautious behavioural intentions in relation to the French label suggests that this approach may be more beneficial for consumer safety. The observations from the follow-up survey, although based on a small sample size and descriptive in nature, revealed that just over half of the sample recalled seeing a warning label about driving impairment on at least one of their medications.
The majority of these respondents reported compliance with the warning advice. However, the results indicated variation in responses concerning alcohol intake and modifying the dose of medication or driving habits so that they could continue to drive, which suggests that the warning advice may not be having the desired impact. The findings of this research have implications for current countermeasures in this area. These include enhancing the role that prescribing doctors play in providing warnings and advice to patients about the impact that their medication can have on driving, increasing consumer perceptions of the authority of pharmacists on this issue, and reinforcing the warning message. More broadly, it is suggested that there would be benefit in wider dissemination of research-based information on increased crash risk, and in systematic monitoring of and publicity about the representation of medications in crashes resulting in injuries and fatalities. Suggestions for future research concern the continued investigation of the effects of medications, and their interactions with existing medical conditions and other substances, on driving skills; the effects of variations in warning label design; individual behaviours and characteristics (particularly among those groups who are dependent upon prescription medication); and the validation of consumer self-assessment of impairment.
Abstract:
The main factors affecting environmental sensitivity to degradation are soil, vegetation, climate and management, through either their intrinsic characteristics or their interaction on the landscape. Different levels of degradation risk may be observed in response to particular combinations of these factors. For instance, the combination of inappropriate management practices and intrinsically weak soil conditions will result in severe degradation of the environment, while the combination of the same type of management with better soil conditions may lead to negligible degradation. The aim of this study was to identify factors and their impact on land degradation processes in three areas of the Basilicata region (southern Italy) using a procedure that couples environmental indices, GIS and crop-soil simulation models. Areas prone to desertification were first identified using the Environmentally Sensitive Areas (ESA) procedure. An analysis identifying the weight that each contributing factor (climate, soil, vegetation, management) had on the ESA was then carried out using GIS techniques. The SALUS model was used to identify the management practices that could lead to better soil conditions and enhance land use sustainability. The best management practices were found to be those that minimized soil disturbance and increased soil organic carbon. Two alternative scenarios with improved soil quality, and consequently improved soil water-holding capacity, were used as mitigation measures. The ESA were recalculated and the effects of the mitigation measures suggested by the model were assessed. The new ESA showed a significant reduction in land degradation.
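In MEDALUS-style ESA procedures, each contributing factor is scored by a quality index (typically the geometric mean of its indicators, scaled so 1.0 is best and 2.0 is worst) and the overall sensitivity index is the geometric mean of the four quality indices. A minimal sketch under that assumption; all indicator scores below are hypothetical:

```python
import math

def quality_index(scores):
    """Geometric mean of indicator scores (1.0 = best, 2.0 = worst),
    as used for each factor in MEDALUS-style ESA procedures."""
    return math.prod(scores) ** (1.0 / len(scores))

# Hypothetical indicator scores for one map cell
sqi = quality_index([1.2, 1.6, 1.4])   # soil quality index
cqi = quality_index([1.1, 1.5])        # climate quality index
vqi = quality_index([1.3, 1.7, 1.2])   # vegetation quality index
mqi = quality_index([1.4, 1.8])        # management quality index

# Environmental sensitivity index: geometric mean of the four indices
esai = (sqi * cqi * vqi * mqi) ** 0.25
print(round(esai, 3))
```

In a GIS workflow this calculation is applied cell by cell over the indicator raster layers; higher ESAI values flag cells as more sensitive to degradation.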
Abstract:
The somatosensory system plays an important role in balance control, and age-related changes to this system have been implicated in falls. Parkinson’s disease (PD) is a chronic and progressive disease of the brain, characterized by postural instability and gait disturbance. Previous research has shown that deficiencies in somatosensory feedback may contribute to the poorer postural control demonstrated by PD individuals. However, few studies have comprehensively explored differences in somatosensory function and postural control between PD participants and healthy older individuals. The soles of the feet contain many cutaneous mechanoreceptors that provide important somatosensory information for postural control. Different types of insole devices have been developed to enhance this somatosensory information and improve postural stability, but these devices are often too complex and expensive to integrate into daily life. Textured insoles provide a more passive intervention that may be an inexpensive and accessible means to enhance the somatosensory input from the plantar surface of the feet. However, to date, there has been little work conducted to test the efficacy of enhanced somatosensory input induced by textured insoles in both healthy and PD populations during standing and walking. Therefore, the aims of this thesis were to determine: 1) whether textured insole surfaces can improve postural stability by enhancing somatosensory information in younger and older adults; 2) the differences between healthy older participants and PD participants for measures of physiological function and postural stability during standing and walking; 3) how changes in somatosensory information affect postural stability in both groups during standing and walking; and 4) whether textured insoles can improve postural stability in both groups during standing and walking.
To address these aims, Study 1 recruited seven older individuals and ten healthy young controls to investigate the effects of two textured insole surfaces on postural stability while performing standing balance tests on a force plate. Participants were tested under three insole surface conditions: 1) barefoot; 2) standing on a hard textured insole surface; and 3) standing on a soft textured insole surface. Measurements derived from the centre of pressure displacement included the range of anterior-posterior and medial-lateral displacement, path length and the 90% confidence elliptical area (C90 area). Results of Study 1 revealed a significant Group*Surface*Insole interaction for the four measures. Both textured insole surfaces reduced postural sway for the older group, especially in the eyes-closed condition on the foam surface. However, participants reported that the soft textured insole surface was more comfortable and, hence, the soft textured insoles were adopted for Studies 2 and 3. For Study 2, 20 healthy older adults (controls) and 20 participants with Parkinson’s disease were recruited. Participants were evaluated using a series of physiological assessments that included touch sensitivity, vibratory perception, and pain and temperature threshold detection. Furthermore, nerve conduction and somatosensory evoked potential tests were utilized to provide detailed information regarding peripheral nerve function for these participants. Standing balance and walking were assessed on different surfaces using a force plate and a 3D Vicon motion analysis system, respectively. Data derived from the force plate included the range of anterior-posterior and medial-lateral sway, while measures of stride length, stride period, cadence, double support time, stance phase, velocity and stride timing variability were reported for the walking assessment.
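The centre-of-pressure measures named here (AP/ML displacement range, path length, C90 area) can all be derived from the force-plate COP trace. A minimal sketch, not the authors' code, assuming the COP is an (N, 2) array of (AP, ML) samples in mm:

```python
import numpy as np

def cop_measures(cop):
    """cop: (N, 2) array of centre-of-pressure samples (AP, ML) in mm."""
    ap, ml = cop[:, 0], cop[:, 1]
    ap_range = ap.max() - ap.min()  # anterior-posterior displacement range
    ml_range = ml.max() - ml.min()  # medial-lateral displacement range
    # Path length: summed Euclidean distance between consecutive samples
    path_length = np.sum(np.linalg.norm(np.diff(cop, axis=0), axis=1))
    # 90% confidence ellipse area from the 2x2 sample covariance:
    # area = pi * chi2(0.90, df=2) * sqrt(det(cov)), with chi2(0.90, 2) ~= 4.605
    c90_area = np.pi * 4.605 * np.sqrt(np.linalg.det(np.cov(cop.T)))
    return ap_range, ml_range, path_length, c90_area

# Usage with synthetic sway data (random-walk COP trace)
rng = np.random.default_rng(0)
cop = np.cumsum(rng.normal(0.0, 0.2, size=(3000, 2)), axis=0)
print(cop_measures(cop))
```

Larger values on all four measures indicate greater postural sway, which is why the reductions reported for the textured insoles are read as improved stability.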
The results of this study demonstrated that the PD group had decrements in somatosensory function compared with the healthy older control group. For electrodiagnosis, PD participants had poorer nerve function than controls, as evidenced by slower nerve conduction velocities and longer latencies in the sural nerve, and a prolonged latency of the P37 somatosensory evoked potential. Furthermore, the PD group displayed more postural sway in both the anterior-posterior and medial-lateral directions relative to controls, and these differences were increased when standing on a foam surface. With respect to the gait assessment, the PD group took shorter strides and had a reduced stride period compared with the control group. Furthermore, the PD group spent more time in the stance phase and had greater cadence and stride timing variability than the controls. Compared with walking on the firm surface, the two groups demonstrated different gait adaptations while walking on the uneven surface. Controls increased their stride length and stride period and decreased their cadence, which resulted in a consistent walking velocity on both surfaces. Conversely, while the PD patients also increased their stride period and decreased their cadence and stance period on the uneven surface, they did not increase their stride length and, hence, walked more slowly on the uneven surface. In the PD group, there was a strong positive association between decreased somatosensory function and decreased clinical balance, as assessed by the Tinetti test. Poorer somatosensory function was also strongly positively correlated with the temporospatial gait parameters, especially shorter stride length. Study 3 evaluated the effects of manipulating the somatosensory information from the plantar surface of the feet using textured insoles in the same populations assessed in Study 2.
For this study, participants performed the standing and walking balance tests under three footwear conditions: 1) barefoot; 2) with smooth insoles; and 3) with textured insoles. Standing balance and walking were evaluated using a force plate and a Vicon motion analysis system, and the data were analysed in the same way outlined for Study 2. The findings showed that the smooth and textured insoles had different effects on postural control during both the standing and walking trials. Both insoles decreased medial-lateral sway to the same level on the firm surface. The greatest benefits were observed in the PD group while wearing the textured insoles. When standing under a more challenging condition on the foam surface with eyes closed, only the textured insoles decreased medial-lateral sway in the PD group. With respect to the gait trials, both insoles increased walking velocity, stride length and stride time and decreased cadence, but these changes were more pronounced for the textured insoles. The effects of the textured insoles were evident under challenging conditions in the PD group, increasing walking velocity and stride length while decreasing cadence. Textured insoles were also effective in reducing the time spent in the double support and stance phases of the gait cycle and did not increase stride timing variability, as was the case for the smooth insoles in the PD group. The results of this study suggest that textured insoles, such as those evaluated in this research, may provide a low-cost means of improving postural stability in high-risk groups, such as people with PD, and may act as an important intervention to prevent falls.
Abstract:
Introduction and Methods: This study compared changes in myokine and myogenic gene expression following resistance exercise (3 sets of 12 repetitions of maximal unilateral knee extension) in 20 elderly men (67.8 ± 1.0 years) and 15 elderly women (67.2 ± 1.5 years). Results: Monocyte chemotactic protein (MCP)-1, macrophage inhibitory protein (MIP)-1β, interleukin (IL)-6 and MyoD mRNA increased significantly (P < 0.05), whereas myogenin and myostatin mRNA decreased significantly after exercise in both groups. Macrophage-1 (Mac-1) and MCP-3 mRNA did not change significantly after exercise in either group. MIP-1β, Mac-1 and myostatin mRNA were significantly higher before and after exercise in men compared with women. In contrast, MCP-3 and myogenin mRNA were significantly higher before and after exercise in the women compared with the men. Conclusions: In elderly individuals, gender influences the mRNA expression of certain myokines and growth factors, both at rest and after resistance exercise. These differences may influence muscle regeneration following muscle injury.
Abstract:
We examined the effects of progressive resistance training (PRT) and supplementation with calcium- and vitamin D3-fortified milk on markers of systemic inflammation, and the relationship between inflammation and changes in muscle mass, size and strength. Healthy men aged 50-79 years (n = 180) participated in this 18-month randomized controlled trial with a 2 × 2 factorial design. Participants were randomized to (1) PRT + fortified milk supplement, (2) PRT, (3) fortified milk supplement, or (4) a control group. Participants assigned to PRT trained 3 days per week, while those in the supplement groups consumed 400 ml/day of milk containing 1,000 mg calcium plus 800 IU vitamin D3. We collected venous blood samples at baseline, 12 and 18 months to measure the serum concentrations of IL-6, TNF-α and hs-CRP. There were no exercise × supplement interactions, but serum IL-6 was 29% lower (95% CI: -62, 0) in the PRT group compared with the control group after 12 months. Conversely, IL-6 was 31% higher (95% CI: -2, 65) in the supplement group compared with the non-supplemented groups after 12 and 18 months. These between-group differences did not persist after adjusting for changes in fat mass. In the PRT group, mid-tibia muscle cross-sectional area increased less in men with higher pre-training inflammation compared with men with lower inflammation (net difference ~2.5%, p < 0.05). In conclusion, serum IL-6 concentration decreased following PRT, whereas it increased after supplementation with fortified milk, concomitant with changes in fat mass. Furthermore, low-grade inflammation at baseline restricted muscle hypertrophy following PRT.
Abstract:
We compared the effects of an ice-slush beverage (ISB) and a cool liquid beverage (CLB) on cycling performance, changes in rectal temperature (Tre) and stress responses in hot, humid conditions. Ten trained male cyclists/triathletes completed two exercise trials (75 min cycling at ~60% peak power output + 50 min seated recovery + a 30 min performance trial at 75% peak power output) on separate occasions in 34°C, 60% relative humidity. During the recovery phase before the performance trial, the athletes consumed either the ISB (mean ± SD: −0.8 ± 0.1°C) or the CLB (18.4 ± 0.5°C). Performance time was not significantly different after consuming the ISB compared with the CLB (29.42 ± 2.07 min for ISB vs. 29.98 ± 3.07 min for CLB, P = 0.263). Tre (37.0 ± 0.3°C for ISB vs. 37.4 ± 0.2°C for CLB, P = 0.001) and the physiological strain index (0.2 ± 0.6 for ISB vs. 1.1 ± 0.9 for CLB, P = 0.009) were lower at the end of recovery and before the performance trial after ingestion of the ISB compared with the CLB. Mean thermal sensation was lower (P < 0.001) during recovery with the ISB compared with the CLB. Changes in plasma volume and the concentrations of blood variables (i.e., glucose, lactate, electrolytes, cortisol and catecholamines) were similar between the two trials. In conclusion, ingestion of the ISB did not significantly alter exercise performance even though it significantly reduced pre-exercise Tre compared with the CLB. Irrespective of performance outcomes, ingestion of an ISB during recovery from exercise in hot, humid environments is a practical and effective method for cooling athletes.
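The physiological strain index reported above is conventionally calculated on a 0-10 scale from rectal temperature and heart rate (Moran et al., 1998); the abstract does not show the calculation, so this is an illustrative sketch with hypothetical values:

```python
def physiological_strain_index(tre_0, tre_t, hr_0, hr_t):
    """Moran et al. (1998) PSI on a 0-10 scale:
    5*(Tre_t - Tre_0)/(39.5 - Tre_0) + 5*(HR_t - HR_0)/(180 - HR_0),
    where Tre is rectal temperature (deg C) and HR is heart rate (bpm);
    subscript 0 = rest, t = current time point."""
    return (5 * (tre_t - tre_0) / (39.5 - tre_0)
            + 5 * (hr_t - hr_0) / (180 - hr_0))

# Hypothetical values: resting Tre 37.0 C / HR 60 bpm, current 37.4 C / 90 bpm
print(round(physiological_strain_index(37.0, 37.4, 60, 90), 2))  # prints 2.05
```

The index weights thermal and cardiovascular strain equally, which is why the lower post-recovery Tre after the ISB translates directly into the lower strain index reported.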
Abstract:
We investigated the effect of carbohydrate ingestion after maximal lengthening contractions of the knee extensors on circulating concentrations of myocellular proteins and cytokines, and on cytokine mRNA expression in muscle. Using a cross-over design, 10 healthy males completed 5 sets of 10 lengthening (eccentric) contractions (unilateral leg press) at 120% of 1-repetition maximum. Subjects were randomized to consume a carbohydrate drink (15% weight per volume; 3 g/kg BM) for 3 h after exercise using one leg, or a placebo drink after exercise using the contralateral leg on another day. Blood samples (10 mL) were collected before exercise and after 0, 30, 60, 90, 120, 150, and 180 min of recovery. Muscle biopsies (vastus lateralis) were collected before exercise and after 3 h of recovery. Following carbohydrate ingestion, serum concentrations of glucose (30-90 min and at 150 min) and insulin (30-180 min) increased (P < 0.05) above pre-exercise values. Serum myoglobin concentration increased (~250%; P < 0.05) after both trials. In contrast, serum cytokine concentrations were unchanged throughout recovery in both trials. Muscle mRNA expression of IL-8 (6.4-fold), MCP-1 (4.7-fold), and IL-6 (7.3-fold) increased substantially after carbohydrate ingestion. TNF-α mRNA expression did not change after either trial. Carbohydrate ingestion during early recovery from exercise-induced muscle injury may promote proinflammatory reactions within skeletal muscle.
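Fold changes in mRNA expression such as those above are commonly derived from qPCR cycle thresholds via the 2^-ΔΔCt method (Livak & Schmittgen, 2001); the abstract does not state which method was used, so this is only an illustrative sketch with hypothetical Ct values:

```python
def fold_change_ddct(ct_target_post, ct_ref_post, ct_target_pre, ct_ref_pre):
    """Relative expression by the 2^-ddCt method (Livak & Schmittgen, 2001).
    Ct = qPCR cycle threshold; lower Ct means more transcript.
    'ref' is a stable reference (housekeeping) gene; 'pre'/'post' are the
    baseline and post-treatment samples."""
    d_ct_post = ct_target_post - ct_ref_post  # normalise to reference, post
    d_ct_pre = ct_target_pre - ct_ref_pre     # normalise to reference, pre
    return 2.0 ** -(d_ct_post - d_ct_pre)     # fold change relative to baseline

# Hypothetical Ct values: the target gene gains ~2.7 cycles on the reference
print(round(fold_change_ddct(22.3, 18.0, 25.0, 18.0), 2))  # prints 6.5
```

A value above 1 indicates up-regulation relative to baseline, so the 6.4-fold IL-8 increase corresponds to the target amplifying roughly 2.7 cycles earlier relative to the reference gene.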
Abstract:
Does exercise promote weight loss? One of the key problems with studies assessing the efficacy of exercise as a method of managing weight and obesity is that mean data are presented and the individual variability in response is overlooked. Recent data have highlighted the need to demonstrate and characterise the individual variability in response to exercise. Do people who exercise compensate for the increase in energy expenditure via compensatory increases in hunger and food intake? The authors address the physiological, psychological and behavioural factors potentially involved in the relationship between exercise and appetite, and identify the research questions that remain unanswered. A negative consequence of the phenomena of individual variability and compensatory responses has been the focus on those who lose little weight in response to exercise; this has been used unreasonably as evidence to suggest that exercise is a futile method of controlling weight and managing obesity. Most of the evidence suggests that exercise is useful for improving body composition and health. For example, when exercise-induced mean weight loss is <1.0 kg, significant improvements in aerobic capacity (+6.3 ml/kg/min), systolic (−6.0 mm Hg) and diastolic (−3.9 mm Hg) blood pressure, waist circumference (−3.7 cm) and positive mood still occur. However, people will vary in their responses to exercise; understanding and characterising this variability will help tailor weight loss strategies to suit individuals.
Abstract:
Immunotherapy is a promising new treatment for patients with advanced prostate and ovarian cancer, but its application is limited by the lack of suitable target antigens that are recognized by CD8+ cytotoxic T lymphocytes (CTL). Human kallikrein 4 (KLK4) is a member of the kallikrein family of serine proteases that is significantly overexpressed in malignant versus healthy prostate and ovarian tissue, making it an attractive target for immunotherapy. We identified a naturally processed, HLA-A*0201-restricted peptide epitope within the signal sequence region of KLK4 that induced CTL responses in vitro in most healthy donors and prostate cancer patients tested. These CTL lysed HLA-A*0201+, KLK4+ cell lines and KLK4 mRNA-transfected monocyte-derived dendritic cells. CTL specific for the HLA-A*0201-restricted KLK4 peptide were more readily expanded to a higher frequency in vitro compared with the known HLA-A*0201-restricted epitopes from prostate cancer antigens: prostate-specific antigen (PSA), prostate-specific membrane antigen (PSMA) and prostatic acid phosphatase (PAP). These data demonstrate that KLK4 is an immunogenic molecule capable of inducing CTL responses and identify it as an attractive target for prostate and ovarian cancer immunotherapy.
Abstract:
Human papillomaviruses (HPVs) are obligate epithelial pathogens and typically cause localized mucosal infections. We therefore hypothesized that T-cell responses to HPV antigens would be greater at sites of pathology than in the blood. Focusing on HPV-16 because of its association with cervical cancer, the magnitude of HPV-specific T-cell responses at the cervix was compared with those in the peripheral blood by intracellular cytokine staining following direct ex vivo stimulation with both virus-like particles assembled from the major capsid protein L1, and the major HPV oncoprotein, E7. We show that both CD4+ and CD8+ T cells from the cervix responded to the HPV-16 antigens and that interferon-γ (IFN-γ) production was HPV type-specific. Comparing HPV-specific T-cell IFN-γ responses at the cervix with those in the blood, we found that while CD4+ and CD8+ T-cell responses to L1 were significantly correlated between compartments (P = 0.02 and P = 0.05, respectively), IFN-γ responses in both T-cell subsets were significantly greater in magnitude at the cervix than in peripheral blood (P = 0.02 and P = 0.003, respectively). In contrast, both CD4+ and CD8+ T-cell IFN-γ responses to E7 were of similar magnitude in both compartments, and CD8+ responses were significantly correlated between these distinct immunological compartments (P = 0.04). We therefore show that inflammatory T-cell responses against L1 (but not E7) demonstrate a clear compartmental bias and that their magnitude reflects local viral replication, while the correlation of HPV-specific responses between compartments indicates their linkage.
Abstract:
Mycobacterium bovis BCG is considered an attractive live bacterial vaccine vector. In this study, we investigated the immune response of baboons to a primary vaccination with recombinant BCG (rBCG) constructs expressing the gag gene from a South African HIV-1 subtype C isolate, followed by a boost with HIV-1 subtype C Pr55gag virus-like particles (Gag VLPs). Using an interferon-γ enzyme-linked immunospot assay, we show that although these rBCG constructs induced only a weak or undetectable HIV-1 Gag-specific response on their own, they efficiently primed for a Gag VLP boost, which strengthened and broadened the immune responses. These responses were predominantly CD8+ T cell-mediated and recognised epitopes similar to those targeted by individuals with early HIV-1 subtype C infection. In addition, a Gag-specific humoral response was elicited. These data support the development of HIV-1 vaccines based on rBCG and Pr55gag VLPs.
Abstract:
Several approaches have been explored to eradicate HIV; however, a multigene vaccine appears to be the best option, given the proven potential of such vaccines to elicit broad, effective responses in animal models. The Pr55 Gag protein is an excellent vaccine candidate in its own right, given that it can assemble into large, enveloped virus-like particles (VLPs) which are highly immunogenic and can moreover be used as a scaffold for the presentation of other large non-structural HIV antigens. In this study, we evaluated the potential of two novel chimaeric HIV-1 Pr55 Gag-based VLP constructs - C-terminal fusions with reverse transcriptase and with a Tat::Nef fusion protein, designated GagRT and GagTN respectively - to enhance the cellular response in mice when used as the boost component in two types of heterologous prime-boost vaccine strategies. A vaccine regimen consisting of a DNA prime and chimaeric HIV-1 VLP boosts induced strong, broad cellular immune responses in mice at an optimum dose of 100 ng VLPs. The cellular responses induced by the DNA prime-VLP boost were two- to three-fold greater than those induced by two DNA vaccinations. Moreover, a mixture of GagRT and GagTN VLPs boosted both antigen-specific CD8+ and CD4+ T-cell responses, whereas VLP-only vaccinations induced predominantly robust Gag-specific CD4+ T-cell responses. These results demonstrate the promising potential of these chimaeric VLPs as vaccine candidates against HIV-1.
Abstract:
A characteristic of Parkinson's disease (PD) is the development of tremor within the 4–6 Hz range. One method used to better understand pathological tremor is to compare it with tremor-like actions generated intentionally by healthy adults. This study was designed to investigate the similarities and differences between voluntarily generated 4–6 Hz tremor and PD tremor with regard to amplitude, frequency and coupling characteristics. Tremor responses of 8 individuals with PD (on- and off-medication) and 12 healthy adults were assessed under postural and resting conditions. Results showed that the voluntary and PD tremors were essentially identical in amplitude and peak frequency. However, the groups differed in the variability (SD of peak frequency, proportional power) and regularity (Approximate Entropy, ApEn) of the tremor signal. Additionally, coherence analysis revealed strong inter-limb coupling during the voluntary conditions, whereas no bilateral coupling was seen for the PD participants. Overall, healthy participants were able to produce a 5 Hz tremulous motion indistinguishable from that of PD patients in terms of peak frequency and amplitude. However, differences in the structure of variability and the level of inter-limb coupling were found between the tremor responses of the PD and healthy adults, and these differences were preserved irrespective of the medication state of the PD participants. The results illustrate the importance of assessing the pattern of signal structure/variability to discriminate between different tremor forms, especially where no differences emerge in standard measures of mean amplitude as traditionally defined.
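The regularity statistic used above, Approximate Entropy (ApEn), can be sketched as follows. This is a minimal, generic implementation of the standard Pincus formulation, not the authors' own analysis code; the embedding dimension m = 2 and tolerance r = 0.2 × SD are conventional defaults assumed here.

```python
import numpy as np

def apen(signal, m=2, r_frac=0.2):
    """Approximate Entropy (Pincus): lower values indicate a more regular signal."""
    x = np.asarray(signal, dtype=float)
    r = r_frac * x.std()  # tolerance scaled to the signal SD (common convention)

    def phi(m):
        n = len(x) - m + 1
        emb = np.array([x[i:i + m] for i in range(n)])  # all m-length templates
        # For each template, the fraction of templates within Chebyshev distance r
        counts = [np.mean(np.max(np.abs(emb - emb[i]), axis=1) <= r)
                  for i in range(n)]
        return np.mean(np.log(counts))

    return phi(m) - phi(m + 1)

# A regular sine wave scores lower (more regular) than random noise
t = np.linspace(0, 20 * np.pi, 400)
print(apen(np.sin(t)))
print(apen(np.random.default_rng(0).standard_normal(400)))
```

In this framing, a lower ApEn for PD tremor than for voluntary tremor would indicate a more stereotyped, less complex oscillation even when amplitude and peak frequency are matched.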
Abstract:
The aim of this study was to investigate the effect of court surface (clay vs. hard court) on technical, physiological and perceptual responses to on-court tennis training. Four high-performance junior male players performed two identical training sessions, one on a hard court and one on clay. Sessions included both physical conditioning and technical elements led by the coach. Each session was filmed for later notational analysis of stroke count and error rates. Players wore a global positioning satellite device to measure distance covered during each session, while heart rate, countermovement jump distance and capillary blood measures of metabolites were recorded before, during and following each session. Additionally, coach and athlete ratings of perceived exertion (RPE) were collected after each session. Total duration and distance covered were comparable between sessions (P > 0.05; d < 0.20). While forehand and backhand stroke volumes did not differ between sessions (P > 0.05; d < 0.30), large effects for increased unforced and forced errors were present on the hard court (P > 0.05; d > 0.90). Furthermore, large effects for increased heart rate, blood lactate and RPE values were evident on clay compared with hard courts (P > 0.05; d > 0.90). Additionally, while player and coach RPE were similar on hard courts, there were large effects for coaches to underrate the RPE of players on clay courts (P > 0.05; d > 0.90). In conclusion, training on clay courts produced trends towards increased heart rate, lactate and RPE values, suggesting that sessions on clay tend towards higher physiological and perceptual loads than those on hard courts. Further, coaches appear effective at rating player RPE on hard courts, but may underrate the perceived exertion of sessions on clay courts.
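The effect-size magnitudes reported above (e.g. d > 0.90 read as a large effect) are Cohen's d values. As a minimal sketch, the conventional pooled-standard-deviation formula can be computed as below; this is the textbook definition and is assumed here, since the abstract does not state which variant was used.

```python
import math

def cohens_d(a, b):
    """Cohen's d with pooled SD: (mean_a - mean_b) / s_pooled."""
    n1, n2 = len(a), len(b)
    m1, m2 = sum(a) / n1, sum(b) / n2
    # Unbiased sample variances (ddof = 1)
    v1 = sum((x - m1) ** 2 for x in a) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in b) / (n2 - 1)
    s_pooled = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / s_pooled

# Example with illustrative (made-up) values: two small samples shifted by 2 units
print(round(cohens_d([1, 2, 3, 4, 5], [3, 4, 5, 6, 7]), 3))  # -1.265
```

With only four players per condition, such effect sizes carry wide uncertainty, which is why the abstract reports them alongside non-significant P values rather than in place of them.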