Abstract:
Postgraduate candidates in the creative arts encounter unique challenges when writing an exegesis (the written document that accompanies creative work as a thesis). As a practitioner-researcher, they must adopt a dual perspective–looking out towards an established field of research, exemplars and theories, as well as inwards towards their experiential creative processes and practice. This dual orientation provides clear benefits, for it enables them to situate the research within its field and make objective claims for the research methodologies and outcomes while maintaining an intimate, voiced relationship with the practice. However, a dual orientation introduces considerable complexities in the writing. It requires a reconciliation of multi-perspectival subject positions: the disinterested academic posture of the observer/ethnographer/analyst/theorist at times; and the invested, subjective stance of the practitioner/producer at others. It requires the author to negotiate a range of writing styles and speech genres–from the formal, polemical style of the theorist to the personal, questioning and emotive voice of reflexivity. Moreover, these varied orientations, subject positions, styles and voices must be integrated into a unified and coherent text. In this chapter I offer a conceptual framework and strategies for approaching this relatively new genre of thesis. I begin by summarizing the characteristics of what has begun to emerge as the predominant model of exegesis (the dual-oriented ‘Connective’ exegesis). Framing it against theoretical and philosophical understandings of polyvocality and matrixicality, I go on to point to recent textual models that provide precedents for connecting differently oriented perspectives, subjectivities and voices. I then turn to emergent archives of practice-led research to explain how the challenge of writing a ‘Connective’ exegesis has so far been resolved by higher degree research (HDR) candidates.
Exemplars illustrate a range of strategies they have used to compose a multi-perspectival text, reconcile the divergent subject positions of the practitioner-researcher, and harmonize the speech genres of a polyvocal text.
Abstract:
We support Shane and Venkataraman’s (2000) basic idea of an “entrepreneurship nexus” where characteristics of the actor as well as those of the “opportunity” they work on influence action and outcomes in the creation of new economic activities. However, a review of the literature reveals that very little progress has been made on the core issues pertaining to the nexus idea. We argue that this is rooted in fundamental and insurmountable problems with the “opportunity” construct itself. Instead, we suggest the admittedly subjective notion of the New Venture Idea as a more workable alternative. We provide a comprehensive definition and explanation of this construct, and take steps towards improved conceptualization and operationalization of its subdimensions. With some further work on these conceptualizations and operationalizations, it should be possible to implement a comprehensive research program that can finally deliver on the promise outlined by Shane and Venkataraman (2000).
Individual variability in compensatory eating following acute exercise in overweight and obese women
Abstract:
Background While compensatory eating following acute aerobic exercise is highly variable, little is known about the underlying mechanisms that contribute to alterations in exercise-induced eating behaviour. Methods Overweight and obese women (BMI = 29.6 ± 4.0 kg/m²) performed a bout of cycling individually tailored to expend 400 kcal (EX), or a time-matched no-exercise control condition, in a randomised, counter-balanced order. Sixty minutes after the cessation of exercise, an ad libitum test meal was provided. Substrate oxidation and subjective appetite ratings were measured during exercise/time-matched rest, and during the period between the cessation of exercise and food consumption. Results While ad libitum energy intake (EI) did not differ between EX and the control condition (666.0 ± 203.9 kcal vs. 664.6 ± 174.4 kcal, respectively; ns), there was marked individual variability in compensatory EI. The difference in EI between EX and the control condition ranged from -234.3 to +278.5 kcal. Carbohydrate (CHO) oxidation during exercise was positively associated with post-exercise EI, accounting for 37% of the variance in EI (r = 0.57; p = 0.02). Conclusions These data indicate that the capacity of acute exercise to create a short-term energy deficit in overweight and obese women is highly variable. Furthermore, exercise-induced CHO oxidation can explain part of the variability in acute exercise-induced compensatory eating. Post-exercise compensatory eating could serve as an adaptive response to facilitate the restoration of carbohydrate balance.
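The compensatory-intake measure described above is simply the per-participant difference between the two test meals. A minimal sketch, with invented values rather than study data:

```python
# Hypothetical sketch of quantifying compensatory energy intake (EI):
# the per-participant difference between ad libitum EI after exercise (EX)
# and after the resting control. All values below are invented.
ex_ei = [700.0, 540.0, 820.0, 610.0]       # kcal at test meal, EX day
control_ei = [650.0, 700.0, 600.0, 640.0]  # kcal at test meal, control day

comp_ei = [e - c for e, c in zip(ex_ei, control_ei)]
print(comp_ei)                      # positive = ate more after exercise
print(min(comp_ei), max(comp_ei))   # the individual range across the sample
```

Group means can mask this variability entirely: as in the abstract, a near-zero mean difference is compatible with large positive and negative individual responses.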
Abstract:
We review and discuss recent developments in best–worst scaling (BWS) that allow researchers to measure items or objects on measurement scales with known properties. We note that BWS has some distinct advantages compared with other measurement approaches, such as category rating scales or paired comparisons. We demonstrate how to use BWS to measure subjective quantities in two different empirical examples. One of these measures preferences for weekend getaways and requires comparing relatively few objects; a second measures academics' perceptions of the quality of academic marketing journals and requires comparing a substantially larger set of objects. We conclude by discussing some limitations and future research opportunities related to BWS.
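The simplest BWS scoring rule, best-minus-worst counting, can be sketched as follows. The items and choice data are invented for illustration; real BWS studies use balanced designs and often model-based scaling rather than raw counts:

```python
from collections import Counter

def bws_scores(choices):
    """Each choice is a (best_item, worst_item) pair from one choice set.
    Score = (# times chosen best) - (# times chosen worst)."""
    best = Counter(b for b, _ in choices)
    worst = Counter(w for _, w in choices)
    items = set(best) | set(worst)
    return {item: best[item] - worst[item] for item in items}

# Toy weekend-getaway choices (hypothetical respondent data).
choices = [("beach", "city"), ("beach", "mountains"), ("city", "mountains")]
print(bws_scores(choices))  # beach: 2, city: 0, mountains: -2
```

The resulting scores place all items on a single difference scale, which is one reason BWS is attractive compared with category rating scales.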
Abstract:
The International Classification of Diseases, Version 10, Australian Modification (ICD-10-AM) is commonly used to classify diseases in hospital patients. ICD-10-AM defines malnutrition as “BMI < 18.5 kg/m² or unintentional weight loss of ≥ 5% with evidence of suboptimal intake resulting in subcutaneous fat loss and/or muscle wasting”. The Australasian Nutrition Care Day Survey (ANCDS) is the most comprehensive survey to evaluate malnutrition prevalence in acute care patients from Australian and New Zealand hospitals. This study determined whether malnourished participants were assigned malnutrition-related codes as per ICD-10-AM. The ANCDS recruited acute care patients from 56 hospitals. Hospital-based dietitians evaluated participants’ nutritional status using BMI and Subjective Global Assessment (SGA). In keeping with the ICD-10-AM definition, malnutrition was defined as BMI < 18.5 kg/m², SGA-B (moderately malnourished) or SGA-C (severely malnourished). After three months, in this prospective cohort study, the hospitals’ health information/medical records departments provided coding results for malnourished participants. Although malnutrition was prevalent in 32% (n = 993) of the cohort (N = 3122), only a small proportion of these participants were coded for malnutrition (n = 162, 16%, p < 0.001). In 21 hospitals, none of the malnourished participants were coded. This is the largest study to provide a snapshot of malnutrition coding in Australian and New Zealand hospitals. The findings highlight gaps in malnutrition documentation and/or subsequent coding, which could potentially result in significant loss of casemix-related revenue for hospitals. Dietitians must lead the way in developing structured processes for malnutrition identification, documentation and coding.
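The malnutrition definition used in the study reduces to a simple rule, which can be sketched as a screening flag. The function name and inputs are illustrative, not part of the survey protocol:

```python
def meets_malnutrition_definition(bmi, sga_grade):
    """Malnutrition per the study's ICD-10-AM-aligned definition:
    BMI < 18.5 kg/m2, or SGA grade B (moderate) or C (severe)."""
    return bmi < 18.5 or sga_grade in ("B", "C")

# The coding gap reported above: 162 of 993 malnourished participants
# received a malnutrition-related code.
print(f"{162 / 993:.0%} coded")  # 16% coded
```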
Abstract:
Background: Nutrition screening is usually administered by nurses. However, most studies on nutrition screening tools have not used nurses to validate the tools. The 3-Minute Nutrition Screening (3-MinNS) tool assesses weight loss, dietary intake and muscle wastage, with the composite score of these components used to determine risk of malnutrition. The aim of this study was to determine the validity and reliability of 3-MinNS administered by nurses, who are the intended assessors. Methods: In this cross-sectional study, three ward-based nurses screened 121 patients aged 21 years and over using 3-MinNS in three wards within 24 hours of admission. A dietitian then assessed the patients’ nutritional status using Subjective Global Assessment (SGA) within 48 hours of admission, whilst blinded to the results of the screening. To assess the reliability of 3-MinNS, 37 patients screened by the first nurse were re-screened within 24 hours by a second nurse, who was blinded to the results of the first nurse. The sensitivity, specificity and best cutoff score for 3-MinNS were determined using the receiver operating characteristic (ROC) curve. Results: The best cutoff score to identify all patients at risk of malnutrition using 3-MinNS was three, with a sensitivity of 89% and a specificity of 88%. This cutoff point also identified all (100%) severely malnourished patients. There was a strong correlation between 3-MinNS and SGA (r = 0.78, p < 0.001). The agreement between the two nurses conducting the 3-MinNS tool was 78.3%. Conclusion: 3-Minute Nutrition Screening is a valid and reliable tool for nurses to identify patients at risk of malnutrition.
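The sensitivity and specificity figures above follow directly from the two-by-two classification table at a given cutoff. A hedged sketch with invented toy data (the study itself compared nurse-administered 3-MinNS scores against dietitian SGA):

```python
def sensitivity_specificity(scores, malnourished, cutoff):
    """Screen positive when score >= cutoff; compare against the
    reference standard. Returns (sensitivity, specificity)."""
    pairs = list(zip(scores, malnourished))
    tp = sum(1 for s, m in pairs if s >= cutoff and m)
    fn = sum(1 for s, m in pairs if s < cutoff and m)
    tn = sum(1 for s, m in pairs if s < cutoff and not m)
    fp = sum(1 for s, m in pairs if s >= cutoff and not m)
    return tp / (tp + fn), tn / (tn + fp)

# Invented screening scores and reference (SGA-based) malnutrition status.
scores = [5, 4, 3, 2, 1, 0]
status = [True, True, True, False, False, False]
print(sensitivity_specificity(scores, status, cutoff=3))  # (1.0, 1.0)
```

An ROC analysis simply repeats this calculation at every candidate cutoff and selects the one with the best trade-off, here a cutoff of three.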
Abstract:
Rail operators recognize a need to increase ridership in order to improve the economic viability of rail service, and to magnify the role that rail travel plays in making cities liveable. This study extends previous research that used cluster analysis with a small sample of rail passengers to identify five salient perspectives of rail access (Zuniga et al., 2013). In this project stage, we used correlation techniques to determine how those perspectives would resonate with two larger study populations: a relatively homogeneous sample of university students in Brisbane, Australia, and a diverse sample of rail passengers in Melbourne, Australia. Findings from Zuniga et al. (2013) described a complex typology of current passengers that was based on respondents’ subjective attitudes and perceptions rather than the socio-demographic or travel behaviour characteristics commonly used for segmentation analysis. The typology included five qualitative perspectives of rail travel. Based on the transport accessibility literature, we expected to find that perspectives from that study emphasizing physical access to rail stations would be shared by current and potential rail passengers who live further from rail stations. Other perspectives might be shared among respondents who live nearby, since the relevance of distance would be diminished. The population living nearby would thus represent an important target group for increasing ridership, since making rail travel accessible to them does not require expansion of costly infrastructure such as new lines or stations. By measuring the prevalence of each perspective in a larger respondent pool, results from this study provide insight into the typical socio-demographic and travel behaviour characteristics that correspond to each perspective of intra-urban rail travel.
In several instances, our quantitative findings reinforced Zuniga et al.’s (2013) qualitative descriptions of passenger types, further validating the original research. This work may directly inform rail operators’ approach to increasing ridership through marketing and improvements to service quality and station experience. Operators in other parts of Australia and internationally may also choose to replicate the study locally, to fine-tune understanding of diverse customer bases. Developing regional and international collaboration would provide additional opportunities to evaluate and benchmark service and station amenities as they address the various access dimensions.
Abstract:
Objectives To examine the effects on monotonous driving of normal sleep versus one night of sleep restriction in continuous positive airway pressure (CPAP)-treated obstructive sleep apnoea (OSA) patients compared with age-matched healthy controls. Methods Nineteen compliant, CPAP-treated male OSA patients (OPs), aged 50–75 years, and 20 healthy age-matched controls underwent both a normal night’s sleep and sleep restriction to 5 h (OPs remained on CPAP) in a counterbalanced design. All participants completed a 2 h afternoon monotonous drive in a realistic car simulator. Driving was monitored for sleepiness-related minor and major lane deviations, with ‘safe’ driving time being the total time driven prior to the first major lane deviation. EEGs were recorded continuously, and subjective sleepiness ratings were taken at regular intervals throughout the drive. Results After a normal night’s sleep, OPs and controls did not differ in terms of driving performance or in their ability to assess the levels of their own sleepiness, with both groups driving ‘safely’ for approximately 90 min. However, after sleep restriction, OPs had a significantly shorter (65 min) safe driving time and had to apply more compensatory effort to maintain their alertness compared with controls. They also underestimated their enhanced sleepiness. Nevertheless, apart from this caveat, there were generally close associations between subjective sleepiness, the likelihood of a major lane deviation and EEG changes indicative of sleepiness. Conclusions With a normal night’s sleep, effectively treated older men with OSA drive as safely as healthy men of the same age. However, after restricted sleep, their driving impairment is worse than that of controls. This suggests that, although successful CPAP treatment can alleviate the potential detrimental effects of OSA on monotonous driving following normal sleep, these patients remain more vulnerable to sleep restriction.
Abstract:
Young men figure prominently in sleep-related road crashes. Non-driving studies show them to be particularly vulnerable to sleep loss compared with older men. We assessed the effect of a normal night's sleep vs. prior sleep restricted to 5 h, in a counterbalanced design, on prolonged (2 h) afternoon simulated driving in 20 younger (av. 23 y) and 19 older (av. 67 y) healthy men. Driving was monitored for sleepiness-related lane deviations, EEGs were recorded continuously and subjective ratings of sleepiness were taken every 200 s. Following normal sleep there were no differences between the groups on any measure. After sleep restriction, younger drivers showed significantly more sleepiness-related deviations and greater 4–11 Hz EEG power, indicative of sleepiness. There was a near-significant increase in subjective sleepiness. Correlations between the EEG and subjective measures were highly significant for both groups, indicating good self-insight into increasing sleepiness. We confirm the greater vulnerability of younger drivers to sleep loss during prolonged afternoon driving.
Abstract:
Purpose Obstructive sleep apnoea (OSA) patients effectively treated by and compliant with continuous positive airway pressure (CPAP) occasionally miss a night’s treatment. The purpose of this study was to use a real-car interactive driving simulator to assess the effects of such an occurrence on the next day’s driving, including the extent to which these drivers are aware of increased sleepiness. Methods Eleven long-term compliant, CPAP-treated male OSA participants, aged 50–75 years, completed a 2-h afternoon, simulated, realistic monotonous drive in an instrumented car twice, following one night of: (1) normal sleep with CPAP and (2) nil CPAP. Drifting out of the road lane (‘incidents’), subjective sleepiness every 200 s and continuous electroencephalogram (EEG) activity indicative of sleepiness and compensatory effort were monitored. Results Withdrawal of CPAP markedly increased sleep disturbance and led to significantly more incidents, a shorter ‘safe’ driving duration, increased alpha and theta EEG power and greater subjective sleepiness. However, increased EEG beta activity indicated that more compensatory effort was being applied. Importantly, under both conditions there was a highly significant correlation between subjective and EEG measures of sleepiness, to the extent that participants were well aware of the effects of nil CPAP. Conclusions Patients should be aware that compliance with treatment every night is crucial for safe driving.
Abstract:
The appropriateness of applying drink driving legislation to motorcycle riding has been questioned, as there may be fundamental differences in the effects of alcohol on driving and motorcycling. It has been suggested that alcohol may redirect riders’ focus from higher-order cognitive skills such as cornering, judgement and hazard perception, to more physical skills such as maintaining balance. To test this hypothesis, the effects of low doses of alcohol on balance ability were investigated in a laboratory setting. The static balance of twenty experienced and twenty novice riders was measured while they performed either no secondary task, a visual (search) task, or a cognitive (arithmetic) task following the administration of alcohol (0%, 0.02%, and 0.05% BAC). Subjective ratings of intoxication and balance impairment increased in a dose-dependent manner in both novice and experienced motorcycle riders, while a BAC of 0.05%, but not 0.02%, was associated with impairments in static balance ability. This balance impairment was exacerbated when riders performed a cognitive, but not a visual, secondary task. Likewise, 0.05% BAC was associated with impairments in novice and experienced riders’ performance of a cognitive, but not a visual, secondary task, suggesting that interactive processes underlie balance and cognitive task performance. There were no observed differences between novice and experienced riders on static balance and secondary task performance, either alone or in combination. Implications for road safety and future ‘drink riding’ policy considerations are discussed.
Abstract:
The appropriateness of applying drink driving legislation to motorcycle riding has been questioned, as there may be fundamental differences in the effects of alcohol on these two activities. For example, while the distributions of blood alcohol content (BAC) levels among fatally injured male drivers and riders are similar, a greater proportion of motorcycle fatalities involve levels in the lower (0 to 0.10% BAC) range. Several psychomotor and higher-order cognitive skills underpinning riding performance appear to be significantly influenced by low levels of alcohol. For example, at low levels (0.02 to 0.046% BAC), riders show significant increases in reaction time to hazardous stimuli, inattention to the riding task, performance errors such as leaving the roadway and a reduced ability to complete a timed course. It has been suggested that alcohol may redirect riders’ focus from higher-order cognitive skills to more physical skills such as maintaining balance. As part of a research program to investigate the potential benefits of introducing a zero, or reduced, BAC for all riders in Queensland regardless of their licence status, the effects of low doses of alcohol on balance ability were investigated in a laboratory setting. The static balance of ten experienced riders was measured while they performed either no secondary task, a visual search task, or a cognitive (arithmetic) task following the administration of alcohol (0%, 0.02%, and 0.05% BAC). Subjective ratings of intoxication and balance impairment increased in a dose-dependent manner; however, objective measures of static balance were negatively affected only at the 0.05% BAC dose. Performance of a concurrent secondary visual search task, but not a purely cognitive (arithmetic) task, improved postural stability across all BAC levels.
Finally, the 0.05% BAC dose was associated with impaired performance on the cognitive (arithmetic) task, but not the visual search task, when participants were balancing, but neither task was impaired by alcohol when participants were standing on the floor. Implications for road safety and future ‘drink riding’ policy considerations are discussed.
Abstract:
Introduction Malnutrition is common among hospitalised patients, with poor follow-up of nutrition support post-discharge. Published studies on the efficacy of ambulatory nutrition support (ANS) for malnourished patients post-discharge are scarce. The aims of this study were to evaluate the rate of dietetics follow-up of malnourished patients post-discharge, before (2008) and after (2010) implementation of a new ANS service, and to evaluate nutritional outcomes post-implementation. Materials and Methods Consecutive samples of 261 (2008) and 163 (2010) adult inpatients referred to dietetics and assessed as malnourished using Subjective Global Assessment (SGA) were enrolled. All subjects received inpatient nutrition intervention and dietetic outpatient clinic follow-up appointments. For the 2010 cohort, ANS was initiated to provide telephone follow-up and home visits for patients who failed to attend the outpatient clinic. Subjective Global Assessment, body weight, quality of life (EQ-5D VAS) and handgrip strength were measured at baseline and five months post-discharge. A paired t-test was used to compare pre- and post-intervention results. Results In 2008, only 15% of patients returned for follow-up with a dietitian within four months post-discharge. After implementation of ANS in 2010, the follow-up rate was 100%. Mean weight improved from 44.0 ± 8.5 kg to 46.3 ± 9.6 kg, EQ-5D VAS from 61.2 ± 19.8 to 71.6 ± 17.4, and handgrip strength from 15.1 ± 7.1 kg force to 17.5 ± 8.5 kg force (p < 0.001 for all). Seventy-four percent of patients improved in SGA score. Conclusion Ambulatory nutrition support resulted in significant improvements in follow-up rate, nutritional status and quality of life of malnourished patients post-discharge.
Abstract:
INTRODUCTION It is known that vascular morphology and functionality are changed following closed soft tissue trauma (CSTT) [1] and bone fractures [2]. The disruption of blood vessels may lead to hypoxia and necrosis. Currently, most clinical methods for the diagnosis and monitoring of CSTT with or without bone fractures are primarily based on qualitative measures or practical experience, making the diagnosis subjective and inaccurate. There is evidence that CSTT and early vascular changes following the injury delay soft tissue and bone healing [3]. However, a precise qualitative and quantitative morphological assessment of vasculature changes after trauma is currently missing. In this research, we aim to establish a diagnostic framework to assess the 3D vascular morphological changes after standardized CSTT in a rat model, qualitatively and quantitatively, using contrast-enhanced micro-CT imaging. METHODS An impact device was used for the application of a controlled, reproducible CSTT to the left thigh (Biceps Femoris) of anaesthetized male Wistar rats. After euthanizing the animals at 6 hours, 24 hours, 3 days, 7 days, or 14 days after trauma, CSTT was qualitatively evaluated by macroscopic visual observation of the skin and muscles. For visualization of the vasculature, the blood vessels of sacrificed rats were flushed with heparinised saline and then perfused with a radio-opaque contrast agent (Microfil, MV 122, Flowtech, USA) using an infusion pump. After allowing the contrast agent to polymerize overnight, both hind-limbs were dissected, and then the whole injured and contra-lateral control limbs were imaged using a micro-CT scanner (µCT 40, Scanco Medical, Switzerland) to evaluate the vascular morphological changes. Correlated biopsy samples were also taken from the CSTT region of both injured and control legs.
Morphological parameters such as the vessel volume ratio (VV/TV), vessel diameter (V.D), spacing (V.Sp), number (V.N), connectivity (V.Conn) and the degree of anisotropy (DA) were then quantified by evaluating the scans of biopsy samples using the micro-CT imaging system. RESULTS AND DISCUSSION A qualitative evaluation of the CSTT has shown that the developed impact protocols were capable of producing a defined and reproducible injury within the region of interest (ROI), resulting in a large hematoma and moderate swelling on both lateral and medial sides of the injured legs. Also, the visualization of the vascular network using 3D images confirmed the ability to perfuse the large vessels and a majority of the microvasculature consistently (Figure 1). Quantification of the vascular morphology obtained from correlated biopsy samples has demonstrated that V.D, V.N and V.Sp were significantly higher in the injured legs 24 hours after impact in comparison with the control legs (p < 0.05). The evaluation of the other time points is currently in progress. CONCLUSIONS The findings of this research will contribute to a better understanding of the changes to the vascular network architecture following traumatic injuries and during the healing process. When interpreted in the context of functional changes, such as tissue oxygenation, this will allow for objective diagnosis and monitoring of CSTT and serve as validation for future non-invasive clinical assessment modalities.
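Of the morphological parameters listed above, the vessel volume ratio (VV/TV) is the simplest to compute from a binary segmentation of the micro-CT volume. A minimal sketch, with a toy nested-list volume standing in for real scanner output (real pipelines also derive V.D, V.N, V.Sp and the other indices from the same segmentation):

```python
def vessel_volume_ratio(volume):
    """volume: nested lists of 0/1 voxels, where 1 marks a vessel voxel
    (e.g. contrast-agent-filled). Returns VV/TV, the fraction of the
    total volume occupied by vessels."""
    voxels = [v for plane in volume for row in plane for v in row]
    return sum(voxels) / len(voxels)

# Toy 2x2x2 "volume" with 3 vessel voxels out of 8.
toy = [[[1, 0], [0, 0]], [[1, 1], [0, 0]]]
print(vessel_volume_ratio(toy))  # 0.375
```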
Abstract:
The assessment of choroidal thickness from optical coherence tomography (OCT) images of the human choroid is an important clinical and research task, since it provides valuable information regarding the eye’s normal anatomy and physiology, and changes associated with various eye diseases and the development of refractive error. Due to the time-consuming and subjective nature of manual image analysis, there is a need for the development of reliable, objective, automated methods of image segmentation to derive choroidal thickness measures. However, the detection of the two boundaries which delineate the choroid is a complicated and challenging task, in particular the detection of the outer choroidal boundary, due to a number of issues including: (i) the vascular ocular tissue is non-uniform and rich in non-homogeneous features, and (ii) the boundary can have a low contrast. In this paper, an automatic segmentation technique based on graph-search theory is presented to segment the inner choroidal boundary (ICB) and the outer choroidal boundary (OCB) to obtain the choroid thickness profile from OCT images. Before the segmentation, the B-scan is pre-processed to enhance the two boundaries of interest and to minimize the artifacts produced by surrounding features. The algorithm to detect the ICB is based on a simple edge filter and a directional weighted map penalty, while the algorithm to detect the OCB is based on OCT image enhancement and a dual brightness probability gradient. The method was tested on a large data set of images from a pediatric (1083 B-scans) and an adult (90 B-scans) population, which were previously manually segmented by an experienced observer. The results demonstrate that the proposed method provides robust detection of the boundaries of interest and is a useful tool for extracting clinical data.
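The boundary-tracing step at the heart of such graph-search segmentation can be illustrated with a minimal dynamic-programming analogue: treat each pixel as a node whose cost is low on strong edges, and find the cheapest connected path across the columns of the B-scan. This sketch omits the paper's pre-processing, directional weighted map and dual brightness probability gradient; the toy cost matrix stands in for an edge-filtered image.

```python
def min_cost_boundary(cost):
    """cost[r][c]: per-pixel cost (low on strong edges). Returns one row
    index per column, forming a connected path whose row changes by at
    most 1 between adjacent columns (a minimum-cost left-to-right path)."""
    rows, cols = len(cost), len(cost[0])
    acc = [[cost[r][0]] + [0] * (cols - 1) for r in range(rows)]
    for c in range(1, cols):
        for r in range(rows):
            prev = [acc[rr][c - 1] for rr in range(max(0, r - 1), min(rows, r + 2))]
            acc[r][c] = cost[r][c] + min(prev)
    # Backtrack from the cheapest endpoint in the last column.
    r = min(range(rows), key=lambda rr: acc[rr][cols - 1])
    path = [r]
    for c in range(cols - 1, 0, -1):
        r = path[-1]
        cand = range(max(0, r - 1), min(rows, r + 2))
        path.append(min(cand, key=lambda rr: acc[rr][c - 1]))
    return path[::-1]

# Toy edge-cost image: the middle row is a strong (cheap) edge.
toy_cost = [[9, 9, 9],
            [1, 1, 1],
            [9, 9, 9]]
print(min_cost_boundary(toy_cost))  # [1, 1, 1]
```

Full graph-search formulations generalise this idea with richer neighbourhoods and edge weights (e.g. Dijkstra on a weighted pixel graph), but the principle of finding a globally minimum-cost path along an enhanced boundary is the same.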