Abstract:
Details are presented of a project which fictionalises the oral history of the life of the author's polio-afflicted grandmother, Beth Bevan, and her experiences at a home for children with disabilities. The speech and language patterns recognised in the first-person narration are described, as is the sense of voice and identity communicated through the oral history.
Abstract:
User-Web interactions have emerged as an important area of research in the field of information science. In this study, we investigate the effects of users’ cognitive styles on their Web navigational styles and information processing strategies. We report results from the analyses of 594 minutes of recorded Web search sessions of 18 participants engaged in 54 scenario-based search tasks. We use questionnaires, a cognitive style test, Web session logs and think-aloud protocols as the data collection instruments. We classify users’ cognitive styles as verbalisers and imagers based on Riding’s (1991) Cognitive Style Analysis test. Two classifications of navigational styles and three categories of information processing strategies are identified. Our study findings show that relationships exist between users’ cognitive styles and their navigational styles and information processing strategies. Verbal users seem to display sporadic navigational styles and adopt a scanning strategy to understand the content of the search result page, while imagery users follow a structured navigational style and a reading approach. We develop a matrix and a model that depict the relationships between users’ cognitive styles and their navigational styles and information processing strategies. We discuss how the findings from this study could help search engine designers to provide adaptive navigation support to users.
Abstract:
Objective: The aim of this literature review is to identify the role of probiotics in the management of enteral tube feeding (ETF) diarrhoea in critically ill patients.---------- Background: Diarrhoea is a common gastrointestinal problem seen in ETF patients. The incidence of diarrhoea in tube fed patients varies from 2% to 68% across all patients. Despite extensive investigation, the pathogenesis surrounding ETF diarrhoea remains unclear. Evidence to support probiotics to manage ETF diarrhoea in critically ill patients remains sparse.---------- Method: Literature on ETF diarrhoea and probiotics in critically ill, adult patients was reviewed from 1980 to 2010. The Cochrane Library, Pubmed, Science Direct, Medline and the Cumulative Index of Nursing and Allied Health Literature (CINAHL) electronic databases were searched using specific inclusion/exclusion criteria. Key search terms used were: enteral nutrition, diarrhoea, critical illness, probiotics, probiotic species and randomised clinical control trial (RCT).---------- Results: Four RCT papers were identified with two reporting full studies, one reporting a pilot RCT and one conference abstract reporting an RCT pilot study. A trend towards a reduction in diarrhoea incidence was observed in the probiotic groups. However, mortality associated with probiotic use in some severely and critically ill patients must caution the clinician against its use.---------- Conclusion: Evidence to support probiotic use in the management of ETF diarrhoea in critically ill patients remains unclear. This paper argues that probiotics should not be administered to critically ill patients until further research has been conducted to examine the causal relationship between probiotics and mortality, irrespective of the patient's disease state or projected prophylactic benefit of probiotic administration.
Abstract:
The shift from 20th century mass communications media towards convergent media and Web 2.0 has raised the possibility of a renaissance of the public sphere, based around citizen journalism and participatory media culture. This paper will evaluate such claims both conceptually and empirically. At a conceptual level, it is noted that the question of whether media democratization is occurring depends in part upon how democracy is understood, with some critical differences in understandings of democracy, the public sphere and media citizenship. The empirical work in this paper draws upon various case studies of new developments in Australian media, including online-only newspapers, developments in public service media, and the rise of commercially based online alternative media. It is argued that participatory media culture is being expanded if understood in terms of media pluralism, but that implications for the public sphere depend in part upon how media democratization is defined.
Abstract:
OBJECTIVE: The accurate quantification of human diabetic neuropathy is important to define at-risk patients, anticipate deterioration, and assess new therapies. ---------- RESEARCH DESIGN AND METHODS: A total of 101 diabetic patients and 17 age-matched control subjects underwent neurological evaluation, neurophysiology tests, quantitative sensory testing, and evaluation of corneal sensation and corneal nerve morphology using corneal confocal microscopy (CCM). ---------- RESULTS: Corneal sensation decreased significantly (P = 0.0001) with increasing neuropathic severity and correlated with the neuropathy disability score (NDS) (r = 0.441, P < 0.0001). Corneal nerve fiber density (NFD) (P < 0.0001), nerve fiber length (NFL) (P < 0.0001), and nerve branch density (NBD) (P < 0.0001) decreased significantly with increasing neuropathic severity and correlated with NDS (NFD r = −0.475, P < 0.0001; NBD r = −0.511, P < 0.0001; and NFL r = −0.581, P < 0.0001). NBD and NFL demonstrated a significant and progressive reduction with worsening heat pain thresholds (P = 0.01). Receiver operating characteristic curve analysis for the diagnosis of neuropathy (NDS >3) defined an NFD of <27.8/mm² with a sensitivity of 0.82 (95% CI 0.68–0.92) and specificity of 0.52 (0.40–0.64), and for detecting patients at risk of foot ulceration (NDS >6) defined an NFD cutoff of <20.8/mm² with a sensitivity of 0.71 (0.42–0.92) and specificity of 0.64 (0.54–0.74). ---------- CONCLUSIONS: CCM is a noninvasive clinical technique that may be used to detect early nerve damage and stratify diabetic patients with increasing neuropathic severity. Established diabetic neuropathy leads to pain and foot ulceration. Detecting neuropathy early may allow intervention with treatments to slow or reverse this condition (1). Recent studies suggested that small unmyelinated C-fibers are damaged early in diabetic neuropathy (2–4) but can only be detected using invasive procedures such as sural nerve biopsy (4,5) or skin-punch biopsy (6–8). Our studies have shown that corneal confocal microscopy (CCM) can identify early small nerve fiber damage and accurately quantify the severity of diabetic neuropathy (9–11). We have also shown that CCM relates to intraepidermal nerve fiber loss (12) and a reduction in corneal sensitivity (13) and detects early nerve fiber regeneration after pancreas transplantation (14). Recently we have also shown that CCM detects nerve fiber damage in patients with Fabry disease (15) and idiopathic small fiber neuropathy (16) when results of electrophysiology tests and quantitative sensory testing (QST) are normal. In this study we assessed corneal sensitivity and corneal nerve morphology using CCM in diabetic patients stratified for the severity of diabetic neuropathy using neurological evaluation, electrophysiology tests, and QST. This enabled us to compare CCM and corneal esthesiometry with established tests of diabetic neuropathy and define their sensitivity and specificity to detect diabetic patients with early neuropathy and those at risk of foot ulceration.
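The cutoff-finding step lends itself to a brief illustration. The sketch below is not the study's code: it assumes hypothetical arrays `nfd` (corneal nerve fiber density) and `neuropathy` (1 where NDS > 3), uses simulated placeholder values, and applies Youden's J simply as one common way to pick a threshold from a ROC curve.

```python
# Illustrative sketch of a ROC-based cutoff search (not the study's own analysis code).
# `nfd` holds corneal nerve fiber density values (fibers/mm^2); `neuropathy` holds
# 1 where NDS > 3, else 0 -- both are hypothetical, simulated arrays.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
neuropathy = rng.integers(0, 2, size=118)           # placeholder labels
nfd = np.where(neuropathy == 1,
               rng.normal(24, 6, 118),              # lower NFD if neuropathic
               rng.normal(32, 6, 118))

# Lower NFD indicates disease, so score = -nfd makes higher scores "positive".
fpr, tpr, thresholds = roc_curve(neuropathy, -nfd)
youden = tpr - fpr                                  # Youden's J statistic
best = np.argmax(youden)
cutoff = -thresholds[best]                          # back to NFD units

print(f"AUC = {roc_auc_score(neuropathy, -nfd):.2f}")
print(f"cutoff NFD < {cutoff:.1f}/mm^2: sensitivity {tpr[best]:.2f}, "
      f"specificity {1 - fpr[best]:.2f}")
```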
Abstract:
Background and Significance Venous leg ulcers are a significant cause of chronic ill-health for 1–3% of those aged over 60 years, increasing in incidence with age. The condition is difficult and costly to heal, consuming 1–2.5% of total health budgets in developed countries and up to 50% of community nursing time. Unfortunately after healing, there is a recurrence rate of 60 to 70%, frequently within the first 12 months after healing. Although some risk factors associated with higher recurrence rates have been identified (e.g. prolonged ulcer duration, deep vein thrombosis), in general there is limited evidence on treatments to effectively prevent recurrence. Patients are generally advised to undertake activities which aim to improve the impaired venous return (e.g. compression therapy, leg elevation, exercise). However, only compression therapy has some evidence to support its effectiveness in prevention, and problems with adherence to this strategy are well documented. Aim The aim of this research was to identify factors associated with recurrence by determining relationships between recurrence and demographic factors, health, physical activity, psychosocial factors and self-care activities to prevent recurrence. Methods Two studies were undertaken: a retrospective study of participants diagnosed with a venous leg ulcer which healed 12 to 36 months prior to the study (n=122); and a prospective longitudinal study of participants recruited as their ulcer healed and data collected for 12 months following healing (n=80). Data were collected from medical records on demographics, medical history and ulcer history and treatments; and from self-report questionnaires on physical activity, nutrition, psychosocial measures, ulcer history, compression and other self-care activities. Follow-up data for the prospective study were collected every three months for 12 months after healing. For the retrospective study, a logistic regression model determined the independent influences of variables on recurrence. For the prospective study, median time to recurrence was calculated using the Kaplan-Meier method and a Cox proportional-hazards regression model was used to adjust for potential confounders and determine effects of preventive strategies and psychosocial factors on recurrence. Results In total, 68% of participants in the retrospective study and 44% of participants in the prospective study suffered a recurrence. After mutual adjustment for all variables in multivariable regression models, leg elevation, compression therapy, self-efficacy and physical activity were found to be consistently related to recurrence in both studies. In the retrospective study, leg elevation, wearing Class 2 or 3 compression hosiery, the level of physical activity, cardiac disease and self-efficacy scores remained significantly associated (p<0.05) with recurrence. The model was significant (p<0.001), with an R² equivalent of 0.62. Examination of relationships between psychosocial factors and adherence to wearing compression hosiery found wearing compression hosiery was significantly positively associated with participants’ knowledge of the cause of their condition (p=0.002), higher self-efficacy scores (p=0.026) and lower depression scores (p=0.009). Analysis of data from the prospective study found there were 35 recurrences (44%) in the 12 months following healing and median time to recurrence was 27 weeks.
After adjustment for potential confounders, a Cox proportional-hazards regression model found that at least an hour/day of leg elevation, six or more days/week in Class 2 (20–25 mmHg) or Class 3 (30–40 mmHg) compression hosiery, higher social support scale scores and higher General Self-Efficacy scores remained significantly associated (p<0.05) with a lower risk of recurrence, while male gender and a history of DVT remained significant risk factors for recurrence. Overall the model was significant (p<0.001), with an R² equivalent of 0.72. Conclusions The high rates of recurrence found in the studies highlight the urgent need for further information in this area to support development of effective strategies for prevention. Overall, results indicate leg elevation, physical activity, compression hosiery and strategies to improve self-efficacy are likely to prevent recurrence. In addition, optimal management of depression and strategies to improve patient knowledge and self-efficacy may positively influence adherence to compression therapy. This research provides important information for development of strategies to prevent recurrence of venous leg ulcers, with the potential to improve health and decrease health care costs in this population.
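A compact sketch of the prospective analysis described above (Kaplan-Meier median time to recurrence plus a Cox proportional-hazards model) might look as follows. The column names, the simulated data and the lifelines library are assumptions for illustration, not the study's own code or dataset.

```python
# Illustrative time-to-event analysis: Kaplan-Meier median time to recurrence
# and a Cox proportional-hazards model. All column names and values are
# hypothetical placeholders, not the study data.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(1)
n = 80
df = pd.DataFrame({
    "weeks_to_event": rng.integers(4, 53, n),    # follow-up or recurrence time (weeks)
    "recurred": rng.integers(0, 2, n),           # 1 = ulcer recurred
    "leg_elevation_1h": rng.integers(0, 2, n),   # >= 1 h/day leg elevation
    "compression_6d": rng.integers(0, 2, n),     # >= 6 days/week class 2-3 hosiery
    "self_efficacy": rng.normal(30, 5, n),       # General Self-Efficacy score
    "male": rng.integers(0, 2, n),
    "history_dvt": rng.integers(0, 2, n),
})

kmf = KaplanMeierFitter()
kmf.fit(df["weeks_to_event"], event_observed=df["recurred"])
print("median time to recurrence (weeks):", kmf.median_survival_time_)

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks_to_event", event_col="recurred")
cph.print_summary()   # hazard ratios with 95% CIs for each covariate
```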
Abstract:
Objective The Active Australia Survey (AAS) is used for physical activity (PA) surveillance in the general Australian adult population, but its validity in older adults has not been evaluated. Our aim was to examine the convergent validity of the AAS questions in older adults. Design The AAS was validated against pedometer step counts as an objective measure of PA, self-reported physical function, and a step-test to assess cardiorespiratory fitness. Method Participants were community-dwelling adults, aged 65-89 y, with the ability to walk 100 m. They completed a self-administered AAS and the step-test in one interview. One week earlier, they completed the Short Form-36 physical function subscale. Between these two interviews, they each wore a YAMAX Digiwalker SW200 pedometer and recorded daily steps. Using the AAS data, daily walking minutes and total PA minutes (walking, moderate-intensity PA and vigorous-intensity PA) were compared with the validity measures using Spearman rank-order correlations. Fifty-three adults completed the study. Results Median daily walking minutes were 34.2 (interquartile range [IQR] 17.1, 60.0), and median daily total PA minutes were 68.6 (IQR 31.4, 113.6). Walking and total PA minutes were both moderately correlated with pedometer steps (Spearman correlation r=0.42, p=0.003, for each) but not with step-test seconds to completion (r=-0.11, p=0.44; r=-0.25, p=0.08, respectively). Total PA minutes were significantly correlated with physical function scores (r=0.39, p=0.004), but walking minutes were not (r=0.15, p=0.29). Conclusions This initial examination of the psychometric properties of the AAS for older adults suggests that this surveillance tool has acceptable convergent validity for ambulatory, community-dwelling older adults.
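A minimal sketch of the convergent-validity calculation follows, assuming hypothetical arrays of AAS-derived minutes and criterion scores rather than the study data.

```python
# Minimal sketch of the convergent-validity check: Spearman rank-order
# correlations between AAS-reported activity minutes and criterion measures.
# Array names and values are hypothetical placeholders.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
n = 53
aas_total_pa_min = rng.gamma(2.0, 35.0, n)        # self-reported total PA min/day
pedometer_steps = 3000 + 40 * aas_total_pa_min + rng.normal(0, 1500, n)
physical_function = np.clip(40 + 0.2 * aas_total_pa_min + rng.normal(0, 15, n), 0, 100)

for name, criterion in [("pedometer steps", pedometer_steps),
                        ("SF-36 physical function", physical_function)]:
    rho, p = spearmanr(aas_total_pa_min, criterion)
    print(f"total PA vs {name}: r = {rho:.2f}, p = {p:.3f}")
```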
Abstract:
Purpose: The aim of this study was to determine current approaches adopted by optometrists to the recording of corneal staining following fluorescein instillation. Methods: An anonymous ‘record-keeping task’ was sent to all 756 practitioners who are members of the Queensland Division of Optometrists Association Australia. This task comprised a form on which appeared a colour photograph depicting contact lens solution-induced corneal staining. Next to the photograph was an empty box, in which practitioners were asked to record their observations. Practitioners were also asked to indicate the level of severity of the condition at which treatment would be instigated. Results: Completed task forms were returned by 228 optometrists, representing a 30 per cent response rate. Ninety-two per cent of respondents offered a diagnosis. The most commonly used descriptive terms were ‘superficial punctate keratitis’ (36 per cent of respondents) and ‘punctate staining’ (29 per cent). The level of severity and location of corneal staining were noted by 69 and 68 per cent of respondents, respectively. A numerical grade was assigned by 44 per cent of respondents. Only three per cent nominated the grading scale used. The standard deviation of assigned grades was ± 0.6. The condition was sketched by 35 per cent of respondents and two per cent stated that they would take a photograph of the eye. Ten per cent noted the eye in which the condition was being observed. Opinions of the level of severity at which treatment for corneal staining should be instigated varied considerably between practitioners, ranging from ‘any sign of corneal staining’ to ‘grade 4 staining’. Conclusion: Although most practitioners made a sensible note of the condition and properly recorded the location of corneal staining, serious deficiencies were evident regarding other aspects of record-keeping. Ongoing programs of professional optometric education should reinforce good practice in relation to clinical record-keeping.
Abstract:
Most research on numerical development in children is behavioural, focusing on accuracy and response time in different problem formats. However, Temple and Posner (1998) used ERPs and the numerical distance task with 5-year-olds to show that the development of numerical representations is difficult to disentangle from the development of the executive components of response organization and execution. Here we use the numerical Stroop paradigm (NSP) and ERPs to study possible executive interference in numerical processing tasks in 6–8-year-old children. In the NSP, the numerical magnitude of the digits is task-relevant and the physical size of the digits is task-irrelevant. We show that younger children are highly susceptible to interference from irrelevant physical information such as digit size, but that access to the numerical representation is almost as fast in young children as in adults. We argue that the developmental trajectories for executive function and numerical processing may act together to determine numerical development in young children.
Abstract:
Rationale, aims and objectives: Patient preference for interventions aimed at preventing in-hospital falls has not previously been investigated. This study aims to contrast the amount patients are willing to pay to prevent falls through six intervention approaches. ----- ----- Methods: This was a cross-sectional willingness-to-pay (WTP), contingent valuation survey conducted among hospital inpatients (n = 125) during their first week on a geriatric rehabilitation unit in Queensland, Australia. Contingent valuation scenarios were constructed for six falls prevention interventions: a falls consultation, an exercise programme, a face-to-face education programme, a booklet and video education programme, hip protectors and a targeted, multifactorial intervention programme. The benefit to participants in terms of reduction in risk of falls was held constant (30% risk reduction) within each scenario. ----- ----- Results: Participants valued the targeted, multifactorial intervention programme the highest [mean WTP (95% CI): $(AUD)268 ($240, $296)], followed by the falls consultation [$215 ($196, $234)], exercise [$174 ($156, $191)], face-to-face education [$164 ($146, $182)], hip protector [$74 ($62, $87)] and booklet and video education interventions [$68 ($57, $80)]. A ‘cost of provision’ bias was identified, which adversely affected the valuation of the booklet and video education intervention. ----- ----- Conclusion: There may be considerable indirect and intangible costs associated with interventions to prevent falls in hospitals that can substantially affect patient preferences. These costs could substantially influence the ability of these interventions to generate a net benefit in a cost–benefit analysis.
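As an illustration of how a mean willingness-to-pay estimate with a 95% confidence interval, such as $268 ($240, $296), is obtained, the sketch below uses made-up WTP responses for a single scenario; the values and variable names are placeholders, not study data.

```python
# Illustrative calculation of a mean willingness-to-pay (WTP) estimate with a
# 95% confidence interval for one contingent valuation scenario.
# The WTP values are hypothetical placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
wtp_multifactorial = rng.gamma(shape=3.0, scale=90.0, size=125)   # AUD, n = 125

mean = wtp_multifactorial.mean()
sem = stats.sem(wtp_multifactorial)
ci_low, ci_high = stats.t.interval(0.95, df=len(wtp_multifactorial) - 1,
                                   loc=mean, scale=sem)
print(f"mean WTP = ${mean:.0f} (95% CI ${ci_low:.0f}, ${ci_high:.0f})")
```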
Abstract:
Skipjack (SJT) (Katsuwonus pelamis) is a medium-sized, pelagic, highly dispersive tuna species that occurs widely across tropical and subtropical waters. SJT constitute the largest tuna fishery in the Indian Ocean, and are currently managed as a single stock. Patterns of genetic variation in a mtDNA gene and 6 microsatellite loci were examined to test for stock structure in the northwestern Indian Ocean. 324 individuals were sampled from five major fishing grounds around Sri Lanka, and single sites in the Maldive Islands and the Laccadive Islands. Phylogenetic reconstruction of mtDNA revealed two coexisting divergent clades in the region. AMOVA (Analysis of Molecular Variance) of mtDNA data revealed significant genetic differentiation among sites (ΦST = 0.2029, P < 0.0001), also supported by SAMOVA results. AMOVA of microsatellite data also showed significant differentiation among most sampled sites (FST = 0.0256, P < 0.001), consistent with the mtDNA pattern. STRUCTURE analysis of the microsatellite data revealed two differentiated stocks. While both marker types examined identified two genetic groups, microsatellite analysis indicates that the sampled SJT are likely to represent individuals sourced from discrete breeding grounds that are mixed in feeding grounds in Sri Lankan waters.
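To make the reported fixation index concrete, the toy sketch below computes Wright's F_ST for a single biallelic locus in two hypothetical sampling sites from allele frequencies alone. This is a deliberate simplification for illustration; the study itself used AMOVA/SAMOVA and STRUCTURE on mtDNA and multilocus microsatellite data.

```python
# Toy illustration of the F-statistic underlying values such as FST = 0.0256:
# Wright's F_ST for one biallelic locus in two equal-sized populations,
# computed from hypothetical allele frequencies.
import numpy as np

def fst_two_pops(p1: float, p2: float) -> float:
    """F_ST = (H_T - H_S) / H_T for two equal-sized populations."""
    p_bar = (p1 + p2) / 2
    h_s = np.mean([2 * p1 * (1 - p1), 2 * p2 * (1 - p2)])  # mean within-pop heterozygosity
    h_t = 2 * p_bar * (1 - p_bar)                           # total heterozygosity
    return (h_t - h_s) / h_t

# e.g. allele frequencies at one locus in two hypothetical sites
print(f"F_ST = {fst_two_pops(0.60, 0.45):.4f}")
```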
Abstract:
Current trends in workforce development indicate the movement of workers within and across occupations to be the norm. In 2009, only one in three vocational education and training (VET) graduates in Australia ended up working in an occupation for which they were trained. This implies that VET enhances the employability of its graduates by equipping them with the knowledge and competencies to work in different occupations and sectors. This paper presents findings from a Government-funded study that examined the occupational mobility of selected associate professional and trades occupations within the Aged Care, Automotive and Civil Construction sectors in Queensland. The study surveyed enrolled nurses and related workers, motor mechanics and civil construction workers to analyse their patterns of occupational mobility, future work intentions, reasons for taking and leaving work, and the factors influencing them to leave or remain in their occupations. This paper also discusses the implications of findings for the training of workers in these sectors and more generally.
Abstract:
Background: Exercise for Health was a pragmatic, randomised, controlled trial comparing the effect of an eight-month exercise intervention on function, treatment-related side effects and quality of life following breast cancer, compared with usual care. The intervention commenced six weeks post-surgery, and two modes of delivering the same intervention were compared with usual care. The purpose of this paper is to describe the study design, along with outcomes related to recruitment, retention and representativeness, and intervention participation. Methods: Women newly diagnosed with breast cancer and residing in a major metropolitan city of Queensland, Australia, were eligible to participate. Consenting women were randomised to a face-to-face-delivered exercise group (FtF, n=67), telephone-delivered exercise group (Tel, n=67) or usual care group (UC, n=60) and were assessed pre-intervention (5 weeks post-surgery), mid-intervention (6 months post-surgery) and 10 weeks post-intervention (12 months post-surgery). Each intervention arm entailed 16 sessions with an Exercise Physiologist. Results: Of 318 potentially eligible women, 63% (n=200) agreed to participate, with a 12-month retention rate of 93%. Participants were similar to the Queensland breast cancer population with respect to disease characteristics, and the randomisation procedure was mostly successful at attaining group balance, with the few minor imbalances observed unlikely to influence intervention effects given balance in other related characteristics. Median participation was 14 (min, max: 0, 16) and 13 (min, max: 3, 16) intervention sessions for the FtF and Tel groups, respectively, with 68% of those in Tel and 82% in FtF participating in at least 75% of sessions. Discussion: Participation in both intervention arms during and following treatment for breast cancer was feasible and acceptable to women. Future work, designed to inform translation into practice, will evaluate the quality of life, clinical, psychosocial and behavioural outcomes associated with each mode of delivery.
Abstract:
Introduction: Emergency prehospital medical care providers are frontline health workers during emergencies. However, little is known about their attitudes, perceptions, and likely behaviors during emergency conditions. Understanding these attitudes and behaviors is crucial to mitigating the psychological and operational effects of biohazard events such as pandemic influenza, and will support the business continuity of essential prehospital services. ----- ----- Problem: This study was designed to investigate the association between knowledge and attitudes regarding avian influenza on likely behavioral responses of Australian emergency prehospital medical care providers in pandemic conditions. ----- ----- Methods: Using a reply-paid postal questionnaire, the knowledge and attitudes of a national, stratified, random sample of the Australian emergency prehospital medical care workforce in relation to pandemic influenza were investigated. In addition to knowledge and attitudes, there were five measures of anticipated behavior during pandemic conditions: (1) preparedness to wear personal protective equipment (PPE); (2) preparedness to change role; (3) willingness to work; and likely refusal to work with colleagues who were exposed to (4) known and (5) suspected influenza. Multiple logistic regression models were constructed to determine the independent predictors of each of the anticipated behaviors, while controlling for other relevant variables. ----- ----- Results: Almost half (43%) of the 725 emergency prehospital medical care personnel who responded to the survey indicated that they would be unwilling to work during pandemic conditions; one-quarter indicated that they would not be prepared to work in PPE; and one-third would refuse to work with a colleague exposed to a known case of pandemic human influenza. Willingness to work during a pandemic (OR = 1.41; 95% CI = 1.0–1.9) and willingness to change roles (OR = 1.44; 95% CI = 1.04–2.0) significantly increased with adequate knowledge about infectious agents generally. Generally, refusal to work with exposed (OR = 0.48; 95% CI = 0.3–0.7) or potentially exposed (OR = 0.43; 95% CI = 0.3–0.6) colleagues significantly decreased with adequate knowledge about infectious agents. Confidence in the employer’s capacity to respond appropriately to a pandemic significantly increased employee willingness to work (OR = 2.83; 95% CI = 1.9–4.1); willingness to change roles during a pandemic (OR = 1.52; 95% CI = 1.1–2.1); preparedness to wear PPE (OR = 1.68; 95% CI = 1.1–2.5); and significantly decreased the likelihood of refusing to work with colleagues exposed to (suspected) influenza (OR = 0.59; 95% CI = 0.4–0.9). ----- ----- Conclusions: These findings indicate that education and training alone will not adequately prepare the emergency prehospital medical workforce for a pandemic. It is crucial to address the concerns of ambulance personnel and the perceived concerns of their relationship with partners in order to maintain an effective prehospital emergency medical care service during pandemic conditions.
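The odds ratios quoted above come from multiple logistic regression models. A hedged sketch of that kind of model is shown below, with hypothetical predictor names and simulated data rather than the survey dataset.

```python
# Sketch of how odds ratios such as "willingness to work, OR = 1.41" can be
# obtained from a multiple logistic regression. Variable names and data are
# hypothetical placeholders, not the survey data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 725
X = pd.DataFrame({
    "adequate_knowledge": rng.integers(0, 2, n),
    "confidence_in_employer": rng.integers(0, 2, n),
    "years_of_service": rng.integers(1, 30, n),
})
# Simulate a binary outcome loosely driven by two of the predictors.
logit = 0.3 * X["adequate_knowledge"] + 1.0 * X["confidence_in_employer"] - 0.2
willing_to_work = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = sm.Logit(willing_to_work, sm.add_constant(X)).fit(disp=False)
odds_ratios = np.exp(model.params)
or_ci = np.exp(model.conf_int())            # 95% CIs on the odds-ratio scale
print(pd.concat([odds_ratios.rename("OR"), or_ci], axis=1))
```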
Abstract:
Adequate blood supply and sufficient mechanical stability are necessary for timely fracture healing. Damage to vessels impairs blood supply, hindering the transport of oxygen, which is an essential metabolite for cells involved in repair. The degree of mechanical stability determines the mechanical conditions in the healing tissues. The mechanical conditions can influence tissue differentiation and may also inhibit revascularization. Knowledge of the actual conditions in a healing fracture in vivo is extremely limited. This study aimed to quantify the pressure, oxygen tension and temperature in the external callus during the early phase of bone healing. Six Merino-mix sheep underwent a tibial osteotomy. The tibia was stabilized with a standard mono-lateral external fixator. A multi-parameter catheter was placed adjacent to the osteotomy gap on the medial aspect of the tibia. Measurements of oxygen tension and temperature were performed for ten days post-op. Measurements of pressure were performed during gait on days three and seven. The ground reaction force and the interfragmentary movements (IFM) were measured simultaneously. The maximum pressure during gait increased (p=0.028) from day three (41.3 [29.2–44.1] mm Hg) to day seven (71.8 [61.8–84.8] mm Hg). During the same interval, there was no change (p=0.92) in the peak ground reaction force or in the interfragmentary movement (compression: p=0.59 and axial rotation: p=0.11). Oxygen tension in the haematoma (74.1 mm Hg [68.6–78.5]) was initially high post-op and decreased steadily over the first five days. The temperature increased over the first four days before reaching a plateau at approximately 38.5 °C on day four. This study is the first to report pressure, oxygen tension and temperature in the early callus tissues. The magnitude of pressure increased even though weight bearing and IFM remained unchanged. Oxygen tensions were initially high in the haematoma and fell gradually, with a low-oxygen environment first established after four to five days. This study illustrates that in bone healing the local environment for cells may not be considered constant with regard to oxygen tension, pressure and temperature.
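The day-three versus day-seven pressure comparison can be illustrated with a paired nonparametric test. The abstract does not name the test actually used, so the Wilcoxon signed-rank test below, with made-up pressures for the six animals, is purely an example.

```python
# Illustrative paired comparison of maximum callus pressure at day 3 vs day 7
# for six animals. Pressures (mm Hg) are hypothetical; the study's own
# statistical test is not named in the abstract.
import numpy as np
from scipy.stats import wilcoxon

pressure_day3 = np.array([41.3, 35.0, 29.2, 44.1, 38.6, 42.0])   # hypothetical
pressure_day7 = np.array([71.8, 65.4, 61.8, 84.8, 70.2, 76.5])   # hypothetical

stat, p = wilcoxon(pressure_day3, pressure_day7)
print(f"Wilcoxon signed-rank: W = {stat:.1f}, p = {p:.3f}")
```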