785 results for Subjective refraction
Abstract:
Young men figure prominently in sleep-related road crashes. Non-driving studies show them to be particularly vulnerable to sleep loss, compared with older men. We assessed the effect of a normal night's sleep vs. prior sleep restricted to 5 h, in a counterbalanced design, on prolonged (2 h) afternoon simulated driving in 20 younger (av. 23 y) and 19 older (av. 67 y) healthy men. Driving was monitored for sleepiness-related lane deviations, EEGs were recorded continuously and subjective ratings of sleepiness were taken every 200 s. Following normal sleep there were no differences between groups for any measure. After sleep restriction, younger drivers showed significantly more sleepiness-related deviations and greater 4–11 Hz EEG power, indicative of sleepiness. There was a near-significant increase in subjective sleepiness. Correlations between the EEG and subjective measures were highly significant for both groups, indicating good self-insight into increasing sleepiness. We confirm the greater vulnerability of younger drivers to sleep loss during prolonged afternoon driving.
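As a rough illustration of the band-power measure mentioned above, the sketch below estimates 4–11 Hz EEG power with Welch's method; the sampling rate and the synthetic signal are assumptions for illustration only, not the study's recording or analysis pipeline.

```python
# Minimal sketch (assumed sampling rate, synthetic signal): 4-11 Hz EEG band power.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

fs = 256                                   # assumed sampling rate in Hz
rng = np.random.default_rng(0)
eeg = rng.standard_normal(fs * 60)         # one minute of placeholder EEG

freqs, psd = welch(eeg, fs=fs, nperseg=fs * 4)      # PSD from 4 s windows
band = (freqs >= 4) & (freqs <= 11)
band_power = trapezoid(psd[band], freqs[band])      # integrate PSD over 4-11 Hz
print(f"4-11 Hz power: {band_power:.4f} (arbitrary units)")
```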
Abstract:
Purpose Obstructive sleep apnoea (OSA) patients effectively treated by and compliant with continuous positive airway pressure (CPAP) occasionally miss a night’s treatment. The purpose of this study was to use a real car interactive driving simulator to assess the effects of such an occurrence on the next day’s driving, including the extent to which these drivers are aware of increased sleepiness. Methods Eleven long-term compliant CPAP-treated 50–75-year-old male OSA participants completed a 2-h afternoon, simulated, realistic monotonous drive in an instrumented car, twice, following one night of: (1) normal sleep with CPAP and (2) nil CPAP. Drifting out of the road lane (‘incidents’), subjective sleepiness every 200 s and continuous electroencephalogram (EEG) activities indicative of sleepiness and compensatory effort were monitored. Results Withdrawal of CPAP markedly increased sleep disturbance and led to significantly more incidents, a shorter ‘safe’ driving duration, increased alpha and theta EEG power and greater subjective sleepiness. However, increased EEG beta activity indicated that more compensatory effort was being applied. Importantly, under both conditions, there was a highly significant correlation between subjective and EEG measures of sleepiness, to the extent that participants were well aware of the effects of nil CPAP. Conclusions Patients should be aware that compliance with treatment every night is crucial for safe driving.
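The subjective-versus-EEG correlation reported here can be sketched as follows; the rating values, the EEG-derived index and the choice of Spearman's correlation are illustrative assumptions, not the study's actual variables or analysis.

```python
# Minimal sketch: correlating repeated subjective sleepiness ratings with a
# concurrent EEG-derived sleepiness index. All values are placeholders.
import numpy as np
from scipy.stats import spearmanr

subjective = np.array([2, 3, 3, 4, 5, 6, 6, 7, 8, 8])            # e.g. ratings every 200 s
eeg_index = np.array([1.1, 1.3, 1.2, 1.8, 2.2, 2.6, 2.5, 3.0, 3.4, 3.3])

rho, p = spearmanr(subjective, eeg_index)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")
```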
Abstract:
The appropriateness of applying drink driving legislation to motorcycle riding has been questioned as there may be fundamental differences in the effects of alcohol on driving and motorcycling. It has been suggested that alcohol may redirect riders’ focus from higher-order cognitive skills such as cornering, judgement and hazard perception, to more physical skills such as maintaining balance. To test this hypothesis, the effects of low doses of alcohol on balance ability were investigated in a laboratory setting. The static balance of twenty experienced and twenty novice riders was measured while they performed either no secondary task, a visual (search) task, or a cognitive (arithmetic) task following the administration of alcohol (0%, 0.02%, and 0.05% BAC). Subjective ratings of intoxication and balance impairment increased in a dose-dependent manner in both novice and experienced motorcycle riders, while a BAC of 0.05%, but not 0.02%, was associated with impairments in static balance ability. This balance impairment was exacerbated when riders performed a cognitive, but not a visual, secondary task. Likewise, a 0.05% BAC was associated with impairments in novice and experienced riders’ performance of a cognitive, but not a visual, secondary task, suggesting that interactive processes underlie balance and cognitive task performance. There were no observed differences between novice and experienced riders in static balance or secondary task performance, either alone or in combination. Implications for road safety and future ‘drink riding’ policy considerations are discussed.
Abstract:
The appropriateness of applying drink driving legislation to motorcycle riding has been questioned as there may be fundamental differences in the effects of alcohol on these two activities. For example, while the distribution of blood alcohol content (BAC) levels among fatally injured male drivers compared to riders is similar, a greater proportion of motorcycle fatalities involve levels in the lower (0 to 0.10% BAC) range. Several psychomotor and higher-order cognitive skills underpinning riding performance appear to be significantly influenced by low levels of alcohol. For example, at low levels (0.02 to 0.046% BAC), riders show significant increases in reaction time to hazardous stimuli, inattention to the riding task, performance errors such as leaving the roadway and a reduced ability to complete a timed course. It has been suggested that alcohol may redirect riders’ focus from higher-order cognitive skills to more physical skills such as maintaining balance. As part of a research program to investigate the potential benefits of introducing a zero, or reduced, BAC for all riders in Queensland regardless of their licence status, the effects of low doses of alcohol on balance ability were investigated in a laboratory setting. The static balance of ten experienced riders was measured while they performed either no secondary task, a visual search task, or a cognitive (arithmetic) task following the administration of alcohol (0%, 0.02%, and 0.05% BAC). Subjective ratings of intoxication and balance impairment increased in a dose-dependent manner; however, objective measures of static balance were negatively affected only at the 0.05% BAC dose. Performance on a concurrent secondary visual search task, but not a purely cognitive (arithmetic) task, improved postural stability across all BAC levels. Finally, the 0.05% BAC dose was associated with impaired performance on the cognitive (arithmetic) task, but not the visual search task, when participants were balancing, but neither task was impaired by alcohol when participants were standing on the floor. Implications for road safety and future ‘drink riding’ policy considerations are discussed.
Abstract:
Introduction Malnutrition is common among hospitalised patients, with poor follow-up of nutrition support post-discharge. Published studies on the efficacy of ambulatory nutrition support (ANS) for malnourished patients post-discharge are scarce. The aims of this study were to evaluate the rate of dietetics follow-up of malnourished patients post-discharge, before (2008) and after (2010) implementation of a new ANS service, and to evaluate nutritional outcomes post-implementation. Materials and Methods Consecutive samples of 261 (2008) and 163 (2010) adult inpatients referred to dietetics and assessed as malnourished using Subjective Global Assessment (SGA) were enrolled. All subjects received inpatient nutrition intervention and dietetic outpatient clinic follow-up appointments. For the 2010 cohort, ANS was initiated to provide telephone follow-up and home visits for patients who failed to attend the outpatient clinic. Subjective Global Assessment, body weight, quality of life (EQ-5D VAS) and handgrip strength were measured at baseline and five months post-discharge. A paired t-test was used to compare pre- and post-intervention results. Results In 2008, only 15% of patients returned for follow-up with a dietitian within four months post-discharge. After implementation of ANS in 2010, the follow-up rate was 100%. Mean weight improved from 44.0 ± 8.5 kg to 46.3 ± 9.6 kg, EQ-5D VAS from 61.2 ± 19.8 to 71.6 ± 17.4 and handgrip strength from 15.1 ± 7.1 kg force to 17.5 ± 8.5 kg force; p<0.001 for all. Seventy-four percent of patients improved in SGA score. Conclusion Ambulatory nutrition support resulted in significant improvements in follow-up rate, nutritional status and quality of life of malnourished patients post-discharge.
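A minimal sketch of the paired pre/post comparison described in the methods is given below; the five pairs of values are invented and only illustrate how a paired t-test is applied to baseline versus five-month follow-up measurements.

```python
# Minimal sketch: paired t-test on baseline vs. five-month follow-up weights.
# The five pairs below are invented for illustration only.
from scipy.stats import ttest_rel

weight_baseline = [42.1, 45.0, 39.8, 50.2, 44.7]   # kg at discharge
weight_followup = [44.0, 46.9, 41.5, 52.0, 46.1]   # kg at five months

t, p = ttest_rel(weight_followup, weight_baseline)
print(f"t = {t:.2f}, p = {p:.4f}")
```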
Abstract:
INTRODUCTION It is known that the vascular morphology and functionality are changed following closed soft tissue trauma (CSTT) [1] and bone fractures [2]. The disruption of blood vessels may lead to hypoxia and necrosis. Currently, most clinical methods for the diagnosis and monitoring of CSTT with or without bone fractures are primarily based on qualitative measures or practical experience, making the diagnosis subjective and inaccurate. There is evidence that CSTT and early vascular changes following the injury delay soft tissue and bone healing [3]. However, a precise qualitative and quantitative morphological assessment of vasculature changes after trauma is currently missing. In this research, we aim to establish a diagnostic framework to assess the 3D vascular morphological changes after standardized CSTT in a rat model qualitatively and quantitatively using contrast-enhanced micro-CT imaging. METHODS An impact device was used for the application of a controlled, reproducible CSTT to the left thigh (Biceps Femoris) of anaesthetized male Wistar rats. After euthanizing the animals at 6 hours, 24 hours, 3 days, 7 days, or 14 days after trauma, CSTT was qualitatively evaluated by macroscopic visual observation of the skin and muscles. For visualization of the vasculature, the blood vessels of sacrificed rats were flushed with heparinised saline and then perfused with a radio-opaque contrast agent (Microfil, MV 122, Flowtech, USA) using an infusion pump. After allowing the contrast agent to polymerize overnight, both hind-limbs were dissected, and the whole injured and contra-lateral control limbs were imaged using a micro-CT scanner (µCT 40, Scanco Medical, Switzerland) to evaluate the vascular morphological changes. Correlated biopsy samples were also taken from the CSTT region of both injured and control legs. Morphological parameters such as the vessel volume ratio (VV/TV), vessel diameter (V.D), spacing (V.Sp), number (V.N), connectivity (V.Conn) and the degree of anisotropy (DA) were then quantified by evaluating the scans of the biopsy samples using the micro-CT imaging system. RESULTS AND DISCUSSION A qualitative evaluation of the CSTT showed that the developed impact protocols were capable of producing a defined and reproducible injury within the region of interest (ROI), resulting in a large hematoma and moderate swelling on both the lateral and medial sides of the injured legs. Also, the visualization of the vascular network using 3D images confirmed the ability to perfuse the large vessels and a majority of the microvasculature consistently (Figure 1). Quantification of the vascular morphology obtained from correlated biopsy samples demonstrated that V.D, V.N and V.Sp were significantly higher in the injured legs 24 hours after impact in comparison with the control legs (p<0.05). The evaluation of the other time points is currently in progress. CONCLUSIONS The findings of this research will contribute to a better understanding of the changes to the vascular network architecture following traumatic injuries and during the healing process. When interpreted in the context of functional changes, such as tissue oxygenation, this will allow for objective diagnosis and monitoring of CSTT and serve as validation for future non-invasive clinical assessment modalities.
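As a simple illustration of one of the morphological parameters listed above, the sketch below computes a vessel volume ratio (VV/TV) from a binarised volume; the synthetic intensities and the global threshold are assumptions for illustration, not the Scanco evaluation pipeline.

```python
# Minimal sketch: vessel volume ratio (VV/TV) from a binarised micro-CT ROI.
# The random volume and the fixed threshold are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
volume = rng.random((64, 64, 64))     # placeholder reconstructed ROI intensities
vessel_mask = volume > 0.95           # assumed threshold isolating contrast-filled vessels

vv = int(vessel_mask.sum())           # vessel voxels (VV)
tv = vessel_mask.size                 # total voxels in the ROI (TV)
print(f"VV/TV = {vv / tv:.4f}")
```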
Abstract:
The assessment of choroidal thickness from optical coherence tomography (OCT) images of the human choroid is an important clinical and research task, since it provides valuable information regarding the eye’s normal anatomy and physiology, and changes associated with various eye diseases and the development of refractive error. Due to the time-consuming and subjective nature of manual image analysis, there is a need for the development of reliable objective automated methods of image segmentation to derive choroidal thickness measures. However, the detection of the two boundaries which delineate the choroid is a complicated and challenging task, in particular the detection of the outer choroidal boundary, due to a number of issues including: (i) the vascular ocular tissue is non-uniform and rich in non-homogeneous features, and (ii) the boundary can have a low contrast. In this paper, an automatic segmentation technique based on graph-search theory is presented to segment the inner choroidal boundary (ICB) and the outer choroidal boundary (OCB) to obtain the choroid thickness profile from OCT images. Before the segmentation, the B-scan is pre-processed to enhance the two boundaries of interest and to minimize the artifacts produced by surrounding features. The algorithm to detect the ICB is based on a simple edge filter and a directional weighted map penalty, while the algorithm to detect the OCB is based on OCT image enhancement and a dual brightness probability gradient. The method was tested on a large data set of images from a pediatric (1083 B-scans) and an adult (90 B-scans) population, which were previously manually segmented by an experienced observer. The results demonstrate that the proposed method provides robust detection of the boundaries of interest and is a useful tool to extract clinical data.
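A generic graph-search style boundary trace is sketched below to illustrate the idea of finding a minimum-cost path across B-scan columns; it is not the paper's ICB/OCB algorithm (which uses a tailored edge filter, directional weighted map penalty and dual brightness probability gradient), and the cost image shown is a placeholder.

```python
# Minimal sketch: a dynamic-programming minimum-cost path across B-scan columns,
# a generic stand-in for graph-search boundary detection (not the paper's method).
import numpy as np

def trace_boundary(cost):
    """Return, per column, the row of the cheapest left-to-right path (moves of +/-1 row)."""
    rows, cols = cost.shape
    acc = cost.copy()
    back = np.zeros((rows, cols), dtype=int)
    for c in range(1, cols):
        for r in range(rows):
            lo, hi = max(0, r - 1), min(rows, r + 2)
            prev = int(np.argmin(acc[lo:hi, c - 1])) + lo
            back[r, c] = prev
            acc[r, c] += acc[prev, c - 1]
    path = np.zeros(cols, dtype=int)
    path[-1] = int(np.argmin(acc[:, -1]))
    for c in range(cols - 1, 0, -1):
        path[c - 1] = back[path[c], c]
    return path

# Placeholder B-scan: cost is low where the vertical intensity gradient is strong,
# so the traced path tends to follow a bright-to-dark (or dark-to-bright) boundary.
bscan = np.random.default_rng(2).random((50, 80))
grad = np.abs(np.diff(bscan, axis=0, prepend=bscan[:1]))
boundary_rows = trace_boundary(1.0 - grad / grad.max())
print(boundary_rows[:10])
```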
Abstract:
Background Nutrition screening is usually administered by nurses. However, most studies on nutrition screening tools have not used nurses to validate the tools. The 3-Minute Nutrition Screening (3-MinNS) assesses weight loss, dietary intake and muscle wastage, with the composite score of the three used to determine risk of malnutrition. The aim of the study was to determine the validity and reliability of 3-MinNS administered by nurses, who are the intended assessors. Methods In this cross-sectional study, three ward-based nurses screened 121 patients aged 21 years and over using 3-MinNS in three wards within 24 hours of admission. A dietitian then assessed the patients’ nutritional status using Subjective Global Assessment (SGA) within 48 hours of admission, whilst blinded to the results of the screening. To assess the reliability of 3-MinNS, 37 patients screened by the first nurse were re-screened by a second nurse within 24 hours, who was blinded to the results of the first nurse. The sensitivity, specificity and best cutoff score for 3-MinNS were determined using the receiver operating characteristic (ROC) curve. Results The best cutoff score to identify all patients at risk of malnutrition using 3-MinNS was three, with a sensitivity of 89% and specificity of 88%. This cutoff point also identified all (100%) severely malnourished patients. There was a strong correlation between 3-MinNS and SGA (r=0.78, p<0.001). The agreement between the two nurses conducting the 3-MinNS tool was 78.3%. Conclusion 3-Minute Nutrition Screening is a valid and reliable tool for nurses to identify patients at risk of malnutrition.
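The cutoff selection described above can be illustrated with a short sketch; the toy scores and the use of Youden's index as the selection rule are assumptions, since the abstract only states that an ROC analysis was performed.

```python
# Minimal sketch: picking a screening cutoff from sensitivity and specificity.
# Toy data; Youden's index as the selection rule is an assumption.
import numpy as np

scores = np.array([0, 1, 2, 3, 3, 4, 5, 1, 2, 5, 0, 4])      # 3-MinNS scores (invented)
at_risk = np.array([0, 0, 0, 1, 1, 1, 1, 0, 0, 1, 0, 1])     # SGA reference (1 = malnourished)

best = None
for cutoff in np.unique(scores):
    flagged = scores >= cutoff
    sens = (flagged & (at_risk == 1)).sum() / (at_risk == 1).sum()
    spec = (~flagged & (at_risk == 0)).sum() / (at_risk == 0).sum()
    youden = sens + spec - 1
    if best is None or youden > best[0]:
        best = (youden, cutoff, sens, spec)
print(f"best cutoff >= {best[1]}: sensitivity {best[2]:.0%}, specificity {best[3]:.0%}")
```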
Abstract:
Background & aims The confounding effect of disease on the outcomes of malnutrition using diagnosis-related groups (DRG) has never been studied in a multidisciplinary setting. This study aims to determine the impact of malnutrition on hospitalisation outcomes, controlling for DRG. Methods Subjective Global Assessment was used to assess the nutritional status of 818 patients within 48 hours of admission. Prospective data were collected on cost of hospitalisation, length of stay (LOS), readmission and mortality up to 3 years post-discharged using National Death Register data. Mixed model analysis and conditional logistic regression matching by DRG were carried out to evaluate the association between nutritional status and outcomes, with the results adjusted for gender, age and race. Results Malnourished patients (29%) had longer hospital stays (6.9±7.3 days vs. 4.6±5.6 days, p<0.001) and were more likely to be readmitted within 15 days (adjusted relative risk = 1.9, 95%CI 1.1–3.2, p=0.025). Within a DRG, the mean difference between actual cost of hospitalisation and the average cost for malnourished patients was greater than well-nourished patients (p=0.014). Mortality was higher in malnourished patients at 1 year (34% vs. 4.1 %), 2 years (42.6% vs. 6.7%) and 3 years (48.5% vs. 9.9%); p<0.001 for all. Overall, malnutrition was a significant predictor of mortality (adjusted hazard ratio = 4.4, 95%CI 3.3-6.0, p<0.001). Conclusions Malnutrition was evident in up to one third of inpatients and led to poor hospitalisation outcomes, even after matching for DRG. Strategies to prevent and treat malnutrition in the hospital and post-discharge are needed.
Abstract:
In contemporary game development circles the ‘game making jam’ has become an important rite of passage and baptism event, an exploration space and a central indie lifestyle affirmation and community event. Game jams have recently become a focus for design researchers interested in the creative process. In this paper we tell the story of an established local game jam and our various documentation and data collection methods. We present the beginnings of the current project, which seeks to map the creative teams and their process in the space of the challenge, and which aims to enable participants to be more than the objects of the data collection. A perceived issue is that typical documentation approaches are ‘about’ the event as opposed to ‘made by’ the participants, and are thus both at odds with the spirit of the jam as a phenomenon and unable to really access the rich playful potential of participant experience. In the data collection and visualisation projects described here, we focus on using collected data to re-include the participants in telling stories about their experiences of the event as a place-based experience. Our goal is to find a means to encourage production of ‘anecdata’ - data based on individual storytelling that is subjective, malleable, and resists collection via formal mechanisms - and to enable mimesis, or active narrating, on the part of the participants. We present a concept design for data as game based on the logic of early medieval maps and we reflect on how we could enable participation in the data collection itself.
Abstract:
The 48 hour game making challenge has been running since 2007. In recent years, we have not only been running a 'game jam' for the local community but have also been exploring the way in which the event itself and the place of the event have the potential to create their own stories. Game jams are the creative festivals of the game development community and a game jam is very much an event or performance; its stories are those of subjective experience. Participants return year after year and recount personal stories from previous challenges; arrival in the 48hr location typically inspires instances of individual memory and narration more in keeping with those of a music festival or an oft-frequented holiday destination. Since its inception, the 48hr has been heavily documented, from the photo-blogging of our first jam and the Twitter streams of more recent events to more formal interviews and documentaries (see Anderson, 2012). We have even had our own moments of Gonzo journalism, with an on-site press room one year and an ‘embedded’ journalist another year (Keogh, 2011). In the last two years of the 48hr we have started to explore ways and means to collect more abstract data during the event, that is, empirical data about movement and activity. The intent behind this form of data collection was to explore graphic and computer-generated visualisations of the event, not for the purpose of formal analysis but in the service of further storytelling. [excerpt from truna aka j.turner, Thomas & Owen, 2013] See: truna aka j.turner, Thomas & Owen (2013) Living the indie life: mapping creative teams in a 48 hour game jam and playing with data, Proceedings of the 9th Australasian Conference on Interactive Entertainment, IE'2013, September 30 - October 01 2013, Melbourne, VIC, Australia.
Abstract:
An updated version, this excellent text is a timely addition to the library of any nurse researching in oncology or other settings where individuals’ quality of life must be understood. Health-related quality of life should be a central aspect of studies concerned with health and illness. Indeed, considerable evidence has recently emerged in oncology and other research settings that self-reported quality of life is of great prognostic significance and may be the most reliable predictor of subsequent morbidity and mortality. From a nursing perspective, it is also gratifying to note that novel therapy and other oncology studies increasingly recognize the importance of understanding patients’ subjective experiences of an intervention over time and of ascertaining whether patients perceive that a new intervention makes a difference to their quality of life and treatment outcomes. Measurements of quality of life are now routine in clinical trials of chemotherapy drugs and are often considered the prime outcome of interest in the cost/benefit analyses of these treatments. The authors have extensive experience in quality-of-life assessment in cancer clinical trials, where most of the pioneering work into quality of life has been conducted. That said, many of the health-related quality-of-life issues discussed are common to many illnesses, and researchers outside of cancer should find the book equally helpful.
Abstract:
Background Cancer-related malnutrition is associated with increased morbidity, poorer tolerance of treatment, decreased quality of life, increased hospital admissions, and increased health care costs (Isenring et al., 2013). This study’s aim was to determine whether a novel, automated screening system was a useful tool for nutrition screening when compared against a full nutrition assessment using the Patient-Generated Subjective Global Assessment (PG-SGA) tool. Methods A single-site, observational, cross-sectional study was conducted in an outpatient oncology day care unit within a Queensland tertiary facility, with three hundred outpatients (51.7% male, mean age 58.6 ± 13.3 years). Eligibility criteria were: age ≥18 years, receiving anticancer treatment, and able to provide written consent. Patients completed the Malnutrition Screening Tool (MST). Nutritional status was assessed using the PG-SGA. Data for the automated screening system were extracted from the pharmacy software program Charm, including body mass index (BMI) and weight records dating back up to six months. Results The prevalence of malnutrition was 17%. Any weight loss over the three to six weeks prior to the most recent weight record, as identified by the automated screening system, predicted PG-SGA-classified malnutrition with 56.52% sensitivity, 35.43% specificity, 13.68% positive predictive value and 81.82% negative predictive value. An MST score of 2 or greater was a stronger predictor of PG-SGA-classified malnutrition (70.59% sensitivity, 69.48% specificity, 32.14% positive predictive value, 92.02% negative predictive value). Conclusions Both the automated screening system and the MST fell short of the accepted professional standard for sensitivity (80%) or specificity (60%) when compared to the PG-SGA. Although the MST remains the better predictor of malnutrition in this setting, uptake of this tool in the Oncology Day Care Unit remains challenging.
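For reference, the screening metrics reported above follow directly from a 2x2 table of screening result versus PG-SGA-defined malnutrition, as in this sketch with invented counts.

```python
# Minimal sketch: sensitivity, specificity, PPV and NPV from a 2x2 table.
# Counts are invented, not the study's data.
tp, fn = 36, 15      # malnourished (PG-SGA): flagged / missed by screening
fp, tn = 76, 173     # well-nourished: flagged / correctly not flagged

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
print(f"sens {sensitivity:.1%}, spec {specificity:.1%}, PPV {ppv:.1%}, NPV {npv:.1%}")
```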
Abstract:
Purpose: To examine between-eye differences in corneal higher order aberrations and topographical characteristics in a range of refractive error groups. Methods: One hundred and seventy subjects were recruited, including 50 emmetropic isometropes, 48 myopic isometropes (spherical equivalent anisometropia ≤ 0.75 D), 50 myopic anisometropes (spherical equivalent anisometropia ≥ 1.00 D) and 22 keratoconics. The corneal topography of each eye was captured using the E300 videokeratoscope (Medmont, Victoria, Australia) and analyzed using custom-written software. All left eye data were rotated about the vertical midline to account for enantiomorphism. Corneal height data were used to calculate the corneal wavefront error using a ray tracing procedure and fit with Zernike polynomials (up to and including the eighth radial order). The wavefront was centred on the line of sight by using the pupil offset value from the pupil detection function in the videokeratoscope. Refractive power maps were analysed to assess corneal sphero-cylindrical power vectors. Differences between the more myopic (or, for keratoconics, more advanced) eye and the less myopic (less advanced) eye were examined. Results: Over a 6 mm diameter, the cornea of the more myopic eye was significantly steeper (refractive power vector M) compared to the fellow eye in both anisometropes (0.10 ± 0.27 D steeper, p = 0.01) and keratoconics (2.54 ± 2.32 D steeper, p < 0.001), while no significant interocular difference was observed for isometropic emmetropes (-0.03 ± 0.32 D) or isometropic myopes (0.02 ± 0.30 D) (both p > 0.05). In keratoconic eyes, the between-eye difference in corneal refractive power was greatest inferiorly (associated with cone location). Similarly, in myopic anisometropes, the more myopic eye displayed a central region of significant inferior corneal steepening (0.15 ± 0.42 D steeper) relative to the fellow eye (p = 0.01). Significant interocular differences in higher order aberrations were only observed in the keratoconic group, for vertical trefoil C(3,-3), horizontal coma C(3,1) and secondary astigmatism along 45° C(4,-2) (p < 0.05), and vertical coma C(3,-1) (p < 0.001). The interocular difference in vertical pupil decentration (relative to the corneal vertex normal) increased with between-eye asymmetry in refraction (isometropia 0.00 ± 0.09, anisometropia 0.03 ± 0.15 and keratoconus 0.08 ± 0.16 mm), as did the interocular difference in corneal vertical coma C(3,-1) (isometropia -0.006 ± 0.142, anisometropia -0.037 ± 0.195 and keratoconus -1.243 ± 0.936 μm), but these only reached statistical significance for pair-wise comparisons between the isometropic and keratoconic groups. Conclusions: There is a high degree of corneal symmetry between the fellow eyes of myopic and emmetropic isometropes. Interocular differences in corneal topography and higher order aberrations are more apparent in myopic anisometropes and keratoconics due to regional (primarily inferior) differences in topography and between-eye differences in vertical pupil decentration relative to the corneal vertex normal. Interocular asymmetries in corneal optics appear to be associated with anisometropic refractive development.
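The refractive power vector M referred to above, together with its astigmatic components J0 and J45, can be computed from a sphero-cylindrical refraction using the standard Thibos-style formulas; the example prescriptions below are invented for illustration.

```python
# Minimal sketch: sphero-cylindrical refraction (S, C, axis) to power vector (M, J0, J45).
import math

def power_vector(sphere, cylinder, axis_deg):
    a = math.radians(axis_deg)
    m = sphere + cylinder / 2.0                 # spherical equivalent M
    j0 = -(cylinder / 2.0) * math.cos(2 * a)    # with-/against-the-rule astigmatism
    j45 = -(cylinder / 2.0) * math.sin(2 * a)   # oblique astigmatism
    return m, j0, j45

print(power_vector(-2.00, -0.75, 175))   # invented right-eye refraction
print(power_vector(-3.25, -0.50, 10))    # invented, more myopic fellow eye
```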