Abstract:
Background Aphasia is an acquired language disorder that can present a significant barrier to patient involvement in healthcare decisions. Speech-language pathologists (SLPs) are viewed as experts in the field of communication. However, many SLP students do not receive practical training in techniques to communicate with people with aphasia (PWA) until they encounter PWA during clinical education placements. Methods This study investigated the confidence and knowledge of SLP students in communicating with PWA prior to clinical placements, using a customised questionnaire. Confidence in communicating with PWA was assessed using a 100-point visual analogue scale. Linear and logistic regressions were used to examine the associations between confidence and age, and between confidence and course type (graduate-entry masters or undergraduate), respectively. Knowledge of strategies to assist communication with PWA was examined by asking respondents to list specific strategies that could assist communication with PWA. Results SLP students were not confident with the prospect of communicating with PWA, reporting a median of 29 points (inter-quartile range 17–47) on the visual analogue confidence scale. Only four (8.2%) respondents rated their confidence greater than 55 (out of 100). Regression analyses indicated no relationship between confidence and students' age (p = 0.31, r-squared = 0.02), or between confidence and course type (p = 0.22, pseudo r-squared = 0.03). Students displayed limited knowledge about communication strategies. Thematic analysis of the strategies revealed four overarching themes: Physical, Verbal Communication, Visual Information and Environmental Changes. While most students identified potential use of resources (such as images and written information), fewer students identified strategies to alter their own verbal communication (such as reduced speech rate).
Conclusions SLP students who had received aphasia-related theoretical coursework, but had not commenced clinical placements with PWA, were not confident in their ability to communicate with PWA. Students may benefit from an educational intervention or curriculum modification to incorporate practical training in effective strategies to communicate with PWA before they encounter PWA in clinical settings. Ensuring students have confidence and knowledge of potential communication strategies to assist communication with PWA may allow them to focus their learning experiences on more specific clinical domains, such as clinical reasoning, rather than on building foundational interpersonal communication skills.
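The association tests in the study above rest on ordinary least-squares regression. As a rough illustration, here is a minimal pure-Python sketch of the r-squared calculation for confidence against age; the data values are invented for the example and are not the study's data:

```python
def r_squared(x, y):
    """Coefficient of determination for a simple linear regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

# Hypothetical student ages and 100-point visual-analogue confidence ratings
ages = [19, 21, 23, 25, 30, 35, 40, 45]
confidence = [28, 35, 17, 47, 29, 40, 22, 31]
print(round(r_squared(ages, confidence), 2))
```

A value near zero, as in the study's r-squared of 0.02, indicates that age explains almost none of the variation in confidence.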
Abstract:
Plant growth can be limited by resource acquisition and defence against consumers, leading to contrasting trade-off possibilities. The competition-defence hypothesis posits a trade-off between competitive ability and defence against enemies (e.g. herbivores and pathogens). The growth-defence hypothesis suggests that strong competitors for nutrients are also defended against enemies, at a cost to growth rate. We tested these hypotheses using observations of 706 plant populations of over 500 species before and following identical fertilisation and fencing treatments at 39 grassland sites worldwide. Strong positive covariance in species responses to both treatments provided support for a growth-defence trade-off: populations that increased with the removal of nutrient limitation (poor competitors) also increased following removal of consumers. This result held globally across 4 years within plant life-history groups and within the majority of individual sites. Thus, a growth-defence trade-off appears to be the norm, and mechanisms maintaining grassland biodiversity may operate within this constraint.
Abstract:
Indicators of mitochondrial function were studied in two different cell culture models of cis-diamminedichloroplatinum-II (CDDP) resistance: the intrinsically resistant human ovarian cancer cell line CI-80-13S, and resistant clones (HeLa-S1a and HeLa-S1b) generated by stable expression of the serine protease inhibitor plasminogen activator inhibitor type-2 (PAI-2) in the human cervical cancer cell line HeLa. In both models, CDDP resistance was associated with sensitivity to killing by adriamycin, etoposide, auranofin, bis[1,2-bis(diphenylphosphino)ethane]gold(I) chloride {[Au(DPPE)2]Cl}, CdCl2 and the mitochondrial inhibitors rhodamine-123 (Rh123), dequalinium chloride (DeCH), tetraphenylphosphonium (TPP) and ethidium bromide (EtBr), and with lower constitutive levels of ATP. Unlike the HeLa clones, CI-80-13S cells were additionally sensitive to chloramphenicol, 1-methyl-4-phenylpyridinium ion (MPP+), rotenone, thenoyltrifluoroacetone (TTFA) and antimycin A, and showed poor reduction of 1-[4,5-dimethylthiazol-2-yl]-2,5-diphenyltetrazolium bromide (MTT), suggesting a deficiency in NADH dehydrogenase and/or succinate dehydrogenase activities. Total platinum uptake and DNA-bound platinum were slightly lower in CI-80-13S than in sensitive cells. The HeLa-S1a and HeLa-S1b clones, on the other hand, showed poor reduction of triphenyltetrazolium chloride (TTC), indicative of low cytochrome c oxidase activity. Total platinum uptake by HeLa-S1a was similar to that of HeLa, but DNA-bound platinum was much lower than for the parent cell line. The mitochondria of CI-80-13S and HeLa-S1a showed altered morphology and were fewer in number than those of JAM and HeLa. In both models, CDDP resistance was associated with less platinum accumulation and with mitochondrial and membrane defects, brought about in one case by expression of a protease inhibitor that is implicated in tumor progression.
Such markers may identify tumors suitable for treatment with gold phosphine complexes or other mitochondrial inhibitors.
Abstract:
The CO2–methane reforming reaction over Ni/SiO2 catalysts has been extensively studied using a range of temperature-programmed techniques and characterisation of the catalysts by thermogravimetry (TG), X-ray diffraction (XRD) and transmission electron microscopy (TEM). The results indicate a strong correlation between the microstructure of the catalyst and its performance. The roles of both CO2 and CH4 in the reaction have been investigated, and the role of methyl radicals in the reaction mechanism highlighted. A reaction mechanism involving dissociatively adsorbed CO2 and methyl radicals has been proposed.
Abstract:
It is exciting to be living at a time when the big questions in biology can be investigated using modern genetics and computing [1]. Bauzà-Ribot et al. [2] take on one of the fundamental drivers of biodiversity, the effect of continental drift in the formation of the world's biota [3,4], employing next-generation sequencing of whole mitochondrial genomes and modern Bayesian relaxed molecular clock analysis. Bauzà-Ribot et al. [2] conclude that vicariance via plate tectonics best explains the genetic divergence between subterranean metacrangonyctid amphipods currently found on islands separated by the Atlantic Ocean. This finding is a big deal in biogeography, and science generally [3], as many other presumed biotic tectonic divergences have been explained as probably due to more recent transoceanic dispersal events [4]. However, molecular clocks can be problematic [5,6], and we have identified three issues with the analyses of Bauzà-Ribot et al. [2] that cast serious doubt on their results and conclusions. When we reanalyzed their mitochondrial data and attempted to account for problems with calibration [5,6], modeling rates across branches [5,7] and substitution saturation [5], we inferred a much younger date for their key node. This implies either a later trans-Atlantic dispersal of these crustaceans or, more likely, a series of later invasions of freshwaters from a common marine ancestor, but either way probably not ancient tectonic plate movements.
Abstract:
In Australia, and elsewhere, the movement of trains on long-haul rail networks is usually planned in advance. Typically, a train plan is developed to confirm that the required train movements and track maintenance activities can occur. The plan specifies when track segments will be occupied by particular trains and maintenance activities. On the day of operation, a train controller monitors and controls the movement of trains and maintenance crews, and updates the train plan in response to unplanned disruptions. It can be difficult to predict how good a plan will be in practice. The main performance indicator for a train service should be reliability: the proportion of trains running the service that complete at or before the scheduled time. We define the robustness of a planned train service to be its expected reliability. The robustness of individual train services, and of a train plan as a whole, can be estimated by simulating the train plan many times with random, but realistic, perturbations to train departure times and segment durations, and then analysing the distributions of arrival times. This process can also be used to set arrival times that will achieve a desired level of robustness for each train service.
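The simulation process described above can be sketched as a small Monte Carlo routine. The perturbation distributions and all numeric figures below are illustrative assumptions, not values from the paper:

```python
import random

def simulate_arrival(planned_departure, segment_durations, rng):
    """One simulated run: perturb the departure time and each segment duration."""
    t = planned_departure + max(0.0, rng.gauss(0.0, 3.0))  # random departure delay (minutes)
    for d in segment_durations:
        t += d * rng.uniform(1.0, 1.15)                    # up to 15% slowdown per segment
    return t

def robustness(planned_departure, segment_durations, scheduled_arrival,
               runs=10000, seed=1):
    """Estimated reliability: fraction of simulated runs arriving on time."""
    rng = random.Random(seed)
    on_time = sum(
        simulate_arrival(planned_departure, segment_durations, rng) <= scheduled_arrival
        for _ in range(runs)
    )
    return on_time / runs

# Hypothetical service: departs at t=0, four segments, ~30 minutes of schedule slack
print(robustness(0.0, [45, 60, 30, 50], scheduled_arrival=215.0))
```

Setting an arrival time for a desired robustness level amounts to taking the corresponding quantile of the simulated arrival-time distribution instead of a fixed threshold.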
Abstract:
The higher education sector in Australia is under increasing pressure to prove quality and efficacy of education provision, including graduate outcomes. One of the central tasks of higher education has become to prepare nascent professionals as far as possible for initial employment and future working lives beyond this (Boden & Nedeva, 2010). Tertiary educators in the creative arts face significant and distinctive challenges in demonstrating graduate employability, and creative graduates consistently have the poorest outcomes of any subject grouping. In part, this is because the national graduate destinations survey (Graduate Careers Council of Australia, 2012) does not cater to the distinctive 'portfolio' nature of creative careers, or take account of the fact that creative careers can take concerted effort over several years to establish (e.g., McCowan & Wyganowska, 2010). However, it is worth asking whether we as tertiary arts educators are doing enough to prepare creative arts students for the world of work, particularly given that the majority of them will be self-employed to some degree (Bureau of Labour Statistics, 2011; Throsby & Zednik, 2010), and will be challenged to build their own careers without recourse to the support of HR departments or intra-firm promotion schemes. It has been demonstrated empirically that career management and creative enterprise skills are among the most important graduate capabilities in determining early creative career success (Bridgstock, 2011), although these skills do not appear in the Learning and Teaching Academic Standards for the Creative and Performing Arts (2010).
This paper explores the nature and development of enterprise capabilities for creative arts students (as distinct from students of the business school), examines best practice in the field internationally, and proposes a theoretically-driven creative arts-specific enterprise curriculum model which commences in first year, for demonstrable impact on student enterprise behaviours (such as grant seeking, professional networking and intention to start an enterprise) and employability.
Abstract:
A new simple test method using small scale models has been developed for testing profiled steel cladding systems under wind uplift/suction forces. This simple method should replace the large scale test method using two-span claddings used at present. It can be used for roof or wall cladding systems fastened with screw fasteners at crests or valleys.
Abstract:
Solar keratoses affect approximately 50% of Australian Caucasians aged over 40 years. Solar keratoses can undergo malignant transformation into squamous cell carcinoma, followed by possible metastasis, and are risk factors for basal cell carcinoma, melanoma, and squamous cell carcinoma. The glutathione-S-transferase genes play a part in the detoxification of carcinogens and mutagens, including some produced by ultraviolet radiation. This study examined the role of glutathione-S-transferase M1, T1, P1, and Z1 gene polymorphisms in susceptibility to solar keratoses development. Using DNA samples from volunteers involved in the Nambour Skin Cancer Prevention Trial, allele and genotype frequencies were determined using polymerase chain reaction and restriction enzyme digestion. No significant differences were detected in glutathione-S-transferase P1 and glutathione-S-transferase Z1 allele or genotype frequencies; however, a significant association between glutathione-S-transferase M1 genotypes and solar keratoses development was detected (p=0.003), with null individuals having an approximately 2-fold increase in risk for solar keratoses development (odds ratio: 2.1; confidence interval: 1.3-3.5) and a significantly higher increase in risk in conjunction with high outdoor exposure (odds ratio: 3.4; confidence interval: 1.9-6.3). A difference in glutathione-S-transferase T1 genotype frequencies was also detected (p=0.039), although after allowing for the multiple testing undertaken this was not considered significant. Fair skin and inability to tan were found to be highly significant risk factors for solar keratoses development, with odds ratios of 18.5 (confidence interval: 5.7-59.9) and 7.4 (confidence interval: 2.6-21.0), respectively.
Overall, the glutathione-S-transferase M1 null genotype conferred a significant increase in risk of solar keratoses development, particularly in the presence of high outdoor exposure and synergistically with the known phenotypic risk factors of fair skin and inability to tan.
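Odds ratios of the kind reported above come from a 2×2 genotype-by-outcome table. A minimal sketch using Woolf's logit method for the confidence interval; the counts are hypothetical, chosen only so that the example reproduces an odds ratio of 2.1:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and ~95% CI (Woolf's method) for a 2x2 table:
       a = exposed cases,   b = exposed controls,
       c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: GSTM1-null vs GSTM1-present, solar keratoses vs none
print(odds_ratio_ci(60, 90, 40, 126))
```

Here "exposed" corresponds to carrying the null genotype; the interval excludes 1 when the association is significant at the chosen level.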
Abstract:
MC1R gene variants have previously been associated with red hair and fair skin color; moreover, skin ultraviolet sensitivity and a strong association with melanoma have been demonstrated for three variant alleles that are active in influencing pigmentation: Arg151Cys, Arg160Trp, and Asp294His. This study confirmed these pigmentary associations with MC1R genotype in a collection of 220 individuals drawn from the Nambour community in Queensland, Australia, 111 of whom were at high risk and 109 at low risk of basal cell carcinoma and squamous cell carcinoma. Comparative allele frequencies for nine MC1R variants that have been reported in the Caucasian population were determined for these two groups, and an association between the prevalence of basal cell carcinoma, squamous cell carcinoma and solar keratosis and the same three active MC1R variant alleles was demonstrated [odds ratio = 3.15, 95% CI (1.7, 5.82)]. Three other commonly occurring variant alleles, Val60Leu, Val92Met, and Arg163Gln, were identified as having a minimal impact on pigmentation phenotype as well as on basal cell carcinoma and squamous cell carcinoma risk. A significant heterozygote effect was demonstrated, whereby individuals carrying a single MC1R variant allele were more likely to have fair and sun-sensitive skin, as well as carriage of a solar lesion, when compared with individuals with a consensus MC1R genotype. After adjusting for the effects of pigmentation on the association between MC1R variant alleles and basal cell carcinoma and squamous cell carcinoma risk, the association persisted, confirming that the presence of at least one variant allele remains informative in predicting the risk of developing a solar-induced skin lesion beyond the information gained through observation of pigmentation phenotype.
Abstract:
AIM: To document and compare current practice in nutrition assessment of Parkinson's disease by dietitians in Australia and Canada, in order to identify priority areas for the review and development of practice guidelines and to direct future research. METHODS: An online survey was distributed to DAA members and PEN subscribers through their email newsletters. The survey captured current practice in the phases of the Nutrition Care Plan. The results of the assessment phase are presented here. RESULTS: Eighty-four dietitians responded. Differences in practice existed in the choice of nutrition screening and assessment tools, including appropriate BMI ranges. Nutrition impact symptoms were commonly assessed, but information about Parkinson's disease medication interactions was not consistently assessed. CONCLUSIONS: The variation in practice related to the use of screening and assessment methods may result in the identification of different goals for subsequent interventions. Even more practice variation was evident for those items more specific to Parkinson's disease, and may be due to the lack of evidence to guide practice. Further research is required to support decisions for nutrition assessment of Parkinson's disease.
Abstract:
Aim This study aimed to describe how supervisors and students use their time during the three domains of nutrition and dietetic clinical placement, and to what extent patient care and non-patient activities change during placement compared with pre- and post-placement. Methods A cohort survey design was used in 2010 with students from two Queensland universities and their supervisors. Participants recorded their time use in either a paper-based or an electronic survey. Supervisors' and students' time use was calculated as independent daily means according to time-use categories reported over the length of the placement. The mean daily number of occasions of service, length of occasions of service, and project and other time use in minutes were reported as productivity output indicators, and the data were imputed. A linear mixed modelling approach was used to describe the relationship between the stage of placement and time use in minutes. Results Combined students' (n=21) and supervisors' (n=29) time use, as occasions of service or length of occasions of service in patient care activities, was significantly different pre, during and post placement. On project-based placements in food service management and community public health nutrition, supervisors' project activity time decreased significantly during placements, with students undertaking more time in project activities. Conclusions This study showed that students do not reduce occasions of service in patient care, and that they enhance project activities in food service and community public health nutrition while on placement. A larger study is required to confirm these results.
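The independent daily means described in the Methods amount to grouping per-day time-use records by placement stage and activity category and averaging within each group. A minimal sketch with a hypothetical time-use log (the stage and category names are invented for illustration):

```python
from collections import defaultdict
from statistics import mean

def daily_means(records):
    """Mean daily minutes per (stage, category) from per-day time-use records."""
    buckets = defaultdict(list)
    for stage, category, minutes in records:
        buckets[(stage, category)].append(minutes)
    return {key: mean(vals) for key, vals in buckets.items()}

# Hypothetical supervisor time-use log: (placement stage, category, minutes/day)
log = [
    ("pre", "patient_care", 180), ("pre", "patient_care", 200),
    ("during", "patient_care", 150), ("during", "project", 60),
    ("post", "patient_care", 190),
]
print(daily_means(log)[("pre", "patient_care")])
```

The linear mixed model in the study then treats these stage effects formally, with random effects accounting for repeated measures on the same participants.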
Abstract:
The International Classification of Diseases, Version 10, Australian Modification (ICD-10-AM) is commonly used to classify diseases in hospital patients. ICD-10-AM defines malnutrition as "BMI < 18.5 kg/m2 or unintentional weight loss of ≥ 5% with evidence of suboptimal intake resulting in subcutaneous fat loss and/or muscle wasting". The Australasian Nutrition Care Day Survey (ANCDS) is the most comprehensive survey to evaluate malnutrition prevalence in acute care patients from Australian and New Zealand hospitals1. This study determined whether malnourished participants were assigned malnutrition-related codes as per ICD-10-AM. The ANCDS recruited acute care patients from 56 hospitals. Hospital-based dietitians evaluated participants' nutritional status using BMI and Subjective Global Assessment (SGA). In keeping with the ICD-10-AM definition, malnutrition was defined as BMI <18.5 kg/m2, SGA-B (moderately malnourished) or SGA-C (severely malnourished). After three months, in this prospective cohort study, the hospitals' health information/medical records departments provided coding results for malnourished participants. Although malnutrition was prevalent in 32% (n=993) of the cohort (N=3122), significantly fewer participants were coded for malnutrition (n=162, 16%, p<0.001). In 21 hospitals, none of the malnourished participants were coded. This is the largest study to provide a snapshot of malnutrition coding in Australian and New Zealand hospitals. The findings highlight gaps in malnutrition documentation and/or subsequent coding, which could potentially result in a significant loss of casemix-related revenue for hospitals. Dietitians must lead the way in developing structured processes for malnutrition identification, documentation and coding.
Abstract:
The Australasian Nutrition Care Day Survey (ANCDS) reported that two in five patients in Australian and New Zealand hospitals consume ≤50% of the offered food. The ANCDS found a significant association between poor food intake and increased in-hospital mortality after controlling for confounders (nutritional status, age, disease type and severity)1. Evidence for the effectiveness of medical nutrition therapy (MNT) in hospital patients eating poorly is lacking. An exploratory study was conducted in the respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, 24-hour food intake (0%, 25%, 50%, 75%, 100% of offered meals) was evaluated for patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals due to nutrition-impact symptoms were referred to ward dietitians for MNT, with food intake re-evaluated on day 7. 184 patients were observed over four weeks. Sixty-two patients (34%) consumed ≤50% of the offered meals. Simple interventions (feeding/menu assistance, diet texture modifications) improved intake to ≥75% in 30 patients, who did not require further MNT. Of the 32 patients referred for MNT, baseline and day-7 data were available for 20 patients (68±17 years, 65% females, BMI: 22±5 kg/m2; median energy and protein intake: 2250 kJ and 25 g, respectively). On day 7, 17 participants (85%) demonstrated significantly higher consumption (4300 kJ, 53 g; p<0.01). Three participants demonstrated no improvement due to ongoing nutrition-impact symptoms. "Percentage food intake" was a quick tool to identify patients in whom simple interventions could enhance intake. MNT was associated with improved dietary intake in hospital patients. Further research is needed to establish a causal relationship.
Abstract:
Australian authorities have set ambitious policy objectives to shift Australia's current transport profile from heavy reliance on private motor cars to sustainable modes. Improving the accessibility of public transport is a central component of that objective. Past studies on accessibility to public transport focus on walking time and/or waiting time. However, travellers' perceptions of the interface legs of journeys may depend not only on these direct and tangible factors but also on social and psychological factors. This paper extends previous research that identified five salient perspectives of rail access by means of a statement-sorting activity and cluster analysis with a small sample of rail passengers in three Australian cities (Zuniga et al., 2013). This study collects a new data set of 144 responses from Brisbane and Melbourne to an online survey made up of a Likert-scaled statement-sorting exercise and questionnaire. It employs factor analysis to examine the statement rankings and uncovers seven underlying factors in an exploratory manner: station, safety, access, transfer, service attitude, travellers' physical activity levels, and environmental concern. Respondents from groups stratified by rail use frequency are compared in terms of their scores on those factors. Findings from this study indicate a need to re-conceptualise accessibility to intra-urban rail travel in agreement with the current policy agenda, and to target behavioural intervention at the multiple dimensions of accessibility influencing passengers' travel choices. Arguments in this paper are not limited to intra-urban rail transit, but may also be relevant to public transport in general.
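Exploratory factor analysis involves more machinery than can be shown briefly (multiple factors, rotation, loadings interpretation), but its core step of extracting a dominant component from Likert-scaled responses can be sketched with power iteration on the sample covariance matrix. This is a crude stand-in for one factor, not the study's method, and the response matrix below is invented:

```python
def first_component(data, iters=200):
    """Leading eigenvector of the covariance matrix via power iteration —
    a rough proxy for the first factor in an exploratory analysis."""
    n, p = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(p)]
    X = [[row[j] - means[j] for j in range(p)] for row in data]   # centre columns
    cov = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)
            for b in range(p)] for a in range(p)]
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(p)) for a in range(p)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Hypothetical Likert responses (rows = respondents, columns = statements)
responses = [[5, 4, 2], [4, 5, 1], [2, 1, 5], [1, 2, 4], [5, 5, 2]]
print(first_component(responses))
```

In this toy data the first two statements co-vary positively and the third runs opposite, so the leading component loads the first two together and the third with the opposite sign, mirroring how factor loadings group related survey statements.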