Abstract:
It is possible to estimate the depth of focus (DOF) of the eye directly from wavefront measurements using various retinal image quality metrics (IQMs). In such methods, DOF is defined as the range of defocus error over which the retinal image quality, calculated from an IQM, degrades to no worse than a certain fraction of its maximum value. Although different retinal image quality metrics are used, two arbitrary threshold levels have been adopted to date: 50% and 80%. There has been limited study of the relationship between these threshold levels and the actual measured DOF. We measured the subjective DOF in a group of 17 normal subjects, and used the through-focus augmented visual Strehl ratio based on the optical transfer function (VSOTF), derived from their wavefront aberrations, as the IQM. For each subject, a VSOTF threshold level was derived that would match the subjectively measured DOF. A significant correlation was found between the subject's estimated threshold level and the higher-order aberration root mean square (HOA RMS) (Pearson's r = 0.88, p < 0.001). This linear correlation can be used to estimate the threshold level for each individual subject, leading to a method for estimating an individual's DOF from a single measurement of their wavefront aberrations.
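The threshold-based DOF definition above is straightforward to compute from a through-focus IQM curve: find the span of defocus values over which the metric stays above a fraction of its peak. A minimal sketch, using a synthetic Gaussian curve as a stand-in for real through-focus VSOTF data (the function name and all numbers are illustrative, not from the study):

```python
import numpy as np

def estimate_dof(defocus, iqm, threshold=0.5):
    """Depth of focus: the range of defocus (in dioptres) over which a
    through-focus image-quality metric stays at or above `threshold`
    times its peak value."""
    level = threshold * np.max(iqm)
    above = defocus[iqm >= level]
    if above.size == 0:
        return 0.0
    return float(above.max() - above.min())

# Synthetic stand-in for a through-focus VSOTF curve (hypothetical data):
defocus = np.linspace(-2.0, 2.0, 401)   # dioptres, 0.01 D steps
vsotf = np.exp(-(defocus / 0.6) ** 2)   # peaked at zero defocus

dof_50 = estimate_dof(defocus, vsotf, threshold=0.5)  # 50% criterion
```

The study's point is that `threshold` need not be fixed at 0.5 or 0.8: it can be predicted per subject from HOA RMS via the reported linear correlation, then plugged into a calculation like this one.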
Abstract:
Background Research involving incapacitated persons with dementia entails complex scientific, legal, and ethical issues, making traditional surveys of layperson views on the ethics of such research challenging. We therefore assessed the impact of democratic deliberation (DD), involving balanced, detailed education and peer deliberation, on the views of those responsible for persons with dementia. Methods One hundred and seventy-eight community-recruited caregivers or primary decision-makers for persons with dementia were randomly assigned to either an all-day DD session group or a control group. Educational materials used for the DD session were vetted for balance and accuracy by an interdisciplinary advisory panel. We assessed the acceptability of family-surrogate consent for dementia research (“surrogate-based research”) from a societal policy perspective as well as from the more personal perspectives of deciding for a loved one or for oneself (surrogate and self-perspectives), at baseline, immediately after the DD session, and 1 month later, for four research scenarios of varying risk-benefit profiles. Results At baseline, a majority in both the DD and control groups supported a policy of family consent for dementia research in all research scenarios. Support for a policy of family consent for surrogate-based research increased in the DD group, but not in the control group. The change in the DD group was maintained 1 month later. In the DD group, there were transient changes in attitudes from the surrogate and self-perspectives. In the control group, there were no changes from baseline in attitude toward surrogate consent from any perspective. Conclusions Intensive, balanced, and accurate education, along with peer deliberation provided by democratic deliberation, led to a sustained increase in support for a societal policy of family consent in dementia research among those responsible for persons with dementia.
Abstract:
Purpose. To investigate the effect of various presbyopic vision corrections on nighttime driving performance on a closed-road driving circuit. Methods. Participants were 11 presbyopes (mean age, 57.3 ± 5.8 years), with a mean best sphere distance refractive error of +0.23 ± 1.53 DS (right eye) and +0.20 ± 1.50 DS (left eye), whose only experience of wearing presbyopic vision correction was reading spectacles. The study involved a repeated-measures design by which a participant's nighttime driving performance was assessed on a closed-road circuit while wearing each of four power-matched vision corrections: single-vision distance lenses (SV), progressive-addition spectacle lenses (PAL), monovision contact lenses (MV), and multifocal contact lenses (MTF CL), worn in a randomized order. Measures included low-contrast road hazard detection and avoidance, road sign and near target recognition, lane-keeping, driving time, and legibility distance for street signs. Eye movement data (fixation duration and number of fixations) were also recorded. Results. Street sign legibility distances were shorter when wearing MV and MTF CL than SV and PAL (P < 0.001), and participants drove more slowly with MTF CL than with PAL (P = 0.048). Wearing SV resulted in more errors (P < 0.001) and in more (P = 0.002) and longer (P < 0.001) fixations when responding to near targets. Fixation duration was also longer when viewing distant signs with MTF CL than with PAL (P = 0.031). Conclusions. Presbyopic vision corrections worn by naive, unadapted wearers affected nighttime driving. Overall, spectacle corrections (PAL and SV) performed well for distance driving tasks, but SV negatively affected viewing near dashboard targets. MTF CL resulted in the shortest legibility distances for street signs and longer fixation times.
Abstract:
Background: Factors that individually influence blood sugar control, health-related quality of life, and diabetes self-care behaviors have been widely investigated; however, most previous diabetes studies have not tested an integrated association between a series of factors and multiple health outcomes. ---------- Objectives: The purposes of this study are to identify risk factors and protective factors and to examine their impact on adaptive outcomes in people with type 2 diabetes. ---------- Design: A descriptive correlational design was used to examine a theoretical model of risk factors, protective factors, and adaptive outcomes. ---------- Settings: This study was conducted at the endocrine outpatient departments of three hospitals in Taiwan. ---------- Participants: A convenience sample of 334 adults with type 2 diabetes aged 40 and over. ---------- Methods: Data were collected by a self-reported questionnaire and physiological examination. Using the structural equation modeling technique, measurement and structural regression models were tested. ---------- Results: Age and life events reflected the construct of risk factors. The construct of protective factors was explained by diabetes symptoms, coping strategy, and social support. The construct of adaptive outcomes comprised HbA1c, health-related quality of life, and self-care behaviors. Protective factors had a significant direct effect on adaptive outcomes (β = 0.68, p < 0.001); however, risk factors did not predict adaptive outcomes (β = −0.48, p = 0.118). ---------- Conclusions: Identifying and managing risk factors and protective factors is an integral part of diabetes care. This theoretical model provides a better understanding of how risk factors and protective factors work together to influence multiple adaptive outcomes in people living with type 2 diabetes.
Abstract:
Concentrations of ultrafine (<0.1 µm) particles (UFPs) and PM2.5 (<2.5 µm) were measured whilst commuting along a similar route by train, bus, ferry and automobile in Sydney, Australia. One trip on each transport mode was undertaken during both morning and evening peak hours throughout a working week, for a total of 40 trips. Analyses comprised one-way ANOVA to compare overall (i.e. all trips combined) geometric mean concentrations of both particle fractions measured across transport modes, and assessment of both the correlation between wind speed and individual trip means of UFPs and PM2.5, and the correlation between the two particle fractions. Overall geometric mean concentrations of UFPs and PM2.5 ranged from 2.8 (train) to 8.4 (bus) × 10⁴ particles cm⁻³ and from 22.6 (automobile) to 29.6 (bus) µg m⁻³, respectively, and a statistically significant difference (p < 0.001) between modes was found for both particle fractions. Individual trip geometric mean concentrations were between 9.7 × 10³ (train) and 2.2 × 10⁵ (bus) particles cm⁻³ and 9.5 (train) to 78.7 (train) µg m⁻³. Estimated commuter exposures were variable, and the highest return trip mean PM2.5 exposure occurred in the ferry mode, whilst the highest UFP exposure occurred during bus trips. The correlation between fractions was generally poor, in keeping with the duality of particle mass and number emissions in vehicle-dominated urban areas. Wind speed was negatively correlated with, and a generally poor determinant of, UFP and PM2.5 concentrations, suggesting a more significant role for other factors in determining commuter exposure.
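The abstract above summarises concentrations as geometric means, which is consistent with approximately log-normal data, so the cross-mode comparison amounts to a one-way ANOVA on log-transformed concentrations. A rough sketch of both calculations, using hypothetical trip means rather than the study's data:

```python
import numpy as np

def geometric_mean(x):
    """Geometric mean: exponential of the mean of the logs."""
    return float(np.exp(np.mean(np.log(x))))

def one_way_anova(groups):
    """F statistic for a one-way ANOVA across groups of observations."""
    all_obs = np.concatenate(groups)
    grand = all_obs.mean()
    ss_between = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(((np.asarray(g) - np.mean(g)) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_obs) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical trip-mean UFP counts (particles per cm^3) by mode:
ufp = {
    "train": [9.7e3, 2.1e4, 3.5e4],
    "bus":   [6.0e4, 8.4e4, 1.1e5],
}
gm = {mode: geometric_mean(vals) for mode, vals in ufp.items()}
# Run the ANOVA on log concentrations, matching the geometric-mean summary:
f_stat = one_way_anova([np.log(v) for v in ufp.values()])
```

The F statistic would then be compared against the F distribution with the stated degrees of freedom to obtain a p-value (e.g. via `scipy.stats`).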
Abstract:
Background/objectives The provision of the patient bed-bath is a fundamental nursing care activity, yet few quantitative data and no qualitative data are available on registered nurses’ (RNs) clinical practice in this domain in the intensive care unit (ICU). The aim of this study was to describe ICU RNs’ current practice with respect to the timing, frequency and duration of the patient bed-bath and the cleansing and emollient agents used. Methods The study utilised a two-phase sequential explanatory mixed method design. Phase one used a questionnaire to survey RNs and phase two employed semi-structured focus group (FG) interviews with RNs. Data were collected over 28 days across four Australian metropolitan ICUs. Ethical approval was granted from the relevant hospital and university human research ethics committees. RNs were asked to complete a questionnaire following each episode of care (i.e. bed-bath) and then to attend one of three FG interviews: RNs with less than 2 years ICU experience; RNs with 2–5 years ICU experience; and RNs with greater than 5 years ICU experience. Results During the 28-day study period the four ICUs had 77.25 beds open. In phase one a total of 539 questionnaires were returned, representing 30.5% of episodes of patient bed-baths (based on 1767 bed occupancy and one bed-bath per patient per day). In 349 bed-bath episodes (54.7%) patients were mechanically ventilated. The bed-bath was given between 02.00 and 06.00 h in 161 episodes (30%), took 15–30 min to complete (n = 195, 36.2%) and was completed within the last 8 h in 304 episodes (56.8%). Cleansing agents used were predominantly pH-balanced soap or liquid soap and water (n = 379, 71%) rather than chlorhexidine-impregnated sponges/cloths (n = 86, 16.1%) or other agents such as pre-packaged washcloths (n = 65, 12.2%). In 347 episodes (64.4%) emollients were not applied after the bed-bath. In phase two 12 FGs were conducted (three FGs at each ICU) with a total of 42 RN participants.
Thematic analysis of FG transcripts across the three levels of RN ICU experience highlighted a transition in patient hygiene practice philosophy, from ‘shades of grey – falling in line’ among inexperienced clinicians to the concrete beliefs of experienced clinicians about patient bed-bath needs. Conclusions This study identified variation in the processes and products used in patient hygiene practices in four ICUs. Further study is required to determine the appropriate timing of patient hygiene activities and the cleansing agents best suited to maintaining skin integrity, with the aim of improving patient outcomes.
Abstract:
Visual impairment is an important contributing factor in falls among older adults, and falls are one of the leading causes of injury and injury-related death in this population. Visual impairment is also associated with greater disability among older adults, including poorer health-related quality of life, increased frailty and reduced postural stability. The majority of this evidence, however, is based on measures of central visual function, rather than peripheral visual function. As such, there is comparatively limited research on the associations between peripheral visual function, disability and falls, and even fewer studies involving older adults with specific diseases which affect peripheral visual function, the most common of which is glaucoma. Glaucoma is one of the leading causes of irreversible vision loss among older adults, affecting around 3 per cent of adults aged over 60 years. The condition is characterised by retinal nerve fibre loss, primarily affecting peripheral visual function. Importantly, the number of older adults with glaucomatous visual impairment is projected to increase as the ageing population grows. The first component of the thesis examined the cross-sectional association between glaucomatous visual impairment and health-related quality of life (Study 1a), functional status (Study 1b) and postural stability (Study 1c) among older adults. A cohort of 74 community-dwelling adults with glaucoma (mean age 74.2 ± 5.9 years) was recruited and completed a baseline assessment. A number of visual function measures were assessed, including central visual function (visual acuity and contrast sensitivity), motion sensitivity, retinal nerve fibre analysis and monocular and binocular visual field measures (monocular 24-2 and binocular integrated visual fields (IVF): IVF-60 and IVF-120).
The analyses focused on the associations between the outcome measures and the severity and location of visual field loss, as this is the primary visual function affected by glaucoma. In Study 1a, we examined the association between visual field loss and health-related quality of life, measured by the Short Form 36-item Health Survey (SF-36). Greater binocular visual field loss, on both IVF measures, was associated with lower SF-36 physical component scores, adjusted for age and gender (Pearson's r = |0.32| to |0.36|, p < 0.001). Furthermore, inferior visual field loss was more strongly associated with the SF-36 physical component than superior field loss. No association was found between visual field loss and SF-36 mental component scores. The association between visual field loss and functional status was examined in Study 1b. Functional status outcome measures included a physical activity questionnaire (Physical Activity Scale for the Elderly, PASE), performance tests (six-minute walk test, timed up and go test and lower leg strength) and an overall functional status score. Significant, but weak, correlations were found between binocular visual field loss and PASE and overall functional status scores, adjusted for age and gender (Pearson's r = |0.24| to |0.33|, p < 0.05). Greater inferior visual field loss, independent of superior visual field loss, was significantly associated with poorer physical performance results and lower overall functional status scores. In Study 1c, we examined the association between visual field loss and postural stability, using a swaymeter device which recorded body movement during four conditions: eyes open and eyes closed, on firm and foam surfaces. Greater binocular visual field loss was associated with increased postural sway, on both firm and foam surfaces, independent of age and gender (Pearson’s r = |0.44| to |0.46|, p < 0.001).
Furthermore, the inferior visual field was a stronger contributor to postural stability than the superior visual field, particularly in the foam condition with the eyes open. Greater visual field loss was associated with a reduction in the visual contribution to postural control, which may underlie the observed association with postural sway. The second component of the thesis examined the association between the severity and location of visual field loss and falls during a 12-month longitudinal follow-up. The number of falls was assessed prospectively using monthly fall calendars. Of the 71 participants who successfully completed the follow-up (mean age 73.9 ± 5.7 years), 44% reported one or more falls, and around 20% reported two or more falls. After adjusting for age and gender, every 10 points missed on the IVF-120 increased the rate of falls by 25% (rate ratio 1.25, 95% confidence interval 1.08–1.44), and every 5 dB reduction in IVF-60 increased the rate of falls by 47% (rate ratio 1.47, 95% confidence interval 1.16–1.87). Inferior visual field loss was a significant predictor of falls, more so than superior field loss, highlighting the importance of the inferior visual field area in safe and efficient navigation. Further analyses indicated that postural stability, more so than functional status, may be a potential mediating factor in the relationship between visual field loss and falls. Future research is required to confirm this causal pathway. In addition, the use of topical beta-blocker medications was not associated with an increased rate of falls in this cohort, compared with the use of other topical anti-glaucoma medications. In summary, greater binocular visual field loss among older adults with glaucoma was associated with poorer health-related quality of life in the physical domain, reduced functional status, greater postural instability and higher rates of falling.
When the location of visual field loss was examined, inferior visual field loss was consistently more strongly associated with these outcomes than superior visual field loss. Insights gained from this research improve our understanding of the association between glaucomatous visual field loss and disability, and its link with falls among older adults. The clinical implications of this research include the need to incorporate visual field screening in falls risk assessments among older adults, and to raise awareness of these findings among eye care practitioners and adults with glaucoma. The findings also inform further research to examine strategies to reduce disability and prevent falls among older adults with glaucoma, promoting healthy ageing and independence for these individuals.
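The fall rate ratios reported in the thesis abstract above (e.g. 1.25 per 10 points missed on the IVF-120) come from a log-linear count model, so a quoted ratio can be rescaled to other increments of field loss by simple exponent arithmetic. A small illustration, assuming only that the underlying model (e.g. Poisson or negative binomial regression) is log-linear in the predictor:

```python
import math

def scale_rate_ratio(rr, from_increment, to_increment):
    """Re-express a rate ratio quoted per `from_increment` units of a
    predictor as a rate ratio per `to_increment` units, assuming a
    log-linear count model (rate proportional to exp(beta * x))."""
    beta = math.log(rr) / from_increment   # per-unit log rate ratio
    return math.exp(beta * to_increment)

# RR 1.25 per 10 points missed implies RR 1.25**2 = 1.5625 per 20 points:
rr_per_20 = scale_rate_ratio(1.25, 10, 20)
```

The same arithmetic shows why per-unit rate ratios for fine-grained predictors look close to 1: the abstract's 1.47 per 5 dB corresponds to about 1.08 per single dB.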
Abstract:
In Queensland, Australia, ultraviolet (UV) radiation levels are high (greater than UV Index 3) almost all year round. Although ambient UV is about three times higher in summer than in winter, Queensland residents receive approximately equal personal doses of UV radiation in these seasons (Neale et al., 2010). Sun protection messages throughout the year are thus essential (Montague et al., 2001), need to reach all segments of the population, and should incorporate guidelines for the maintenance of adequate vitamin D levels. Knowledge is an essential requirement for people to make health-conscious decisions. Unprompted knowledge commonly requires a higher level of awareness or recency of acquisition than prompted recall (Waller et al., 2004). This paper thus reports further on data from a 2008 population-based, cross-sectional telephone survey conducted in Queensland, Australia (2,001 participants; response rate = 45%) (Youl et al., 2009). The aim of this research was to establish the level of, and the factors predicting, unprompted and prompted knowledge about health and vitamin D.
Abstract:
The relationship between participation in civic and political activities and membership of voluntary associations is now well established. What is less clear is the relative impact of how much time people spend on group activities (associational intensity) and of the number and type of groups that individuals are involved with (associational scope). Does it matter in terms of civic engagement, for example, whether one is a member of a quilting circle or a trade union? Does it matter whether association ‘membership’ is simply an annual payment or a major commitment of time and energy? In this article, we use a large survey to explore these questions empirically by focusing on the membership patterns and civic engagement practices of 4,001 citizens drawn from eight suburbs across Greater Melbourne, Australia. Our findings indicate that, while associational intensity is positively related to civic engagement, associational scope (the number of group memberships per person) is a more influential determinant of the level of civic and political participation. The results also suggest that while all forms of associationalism are important in terms of fostering greater levels of civic activity, not all forms have the same impact.
Abstract:
The study objective was to determine whether the ‘cardiac decompensation score’ could identify cardiac decompensation in patients with existing cardiac compromise managed with intra-aortic balloon counterpulsation (IABP). A one-group, posttest-only design was utilised to collect observations in 2003 from IABP recipients treated in the intensive care unit of a 450-bed Australian, government-funded, public, cardiothoracic, tertiary referral hospital. Twenty-three consecutive IABP recipients were enrolled, four of whom died in ICU (17.4%). All non-survivors exhibited primarily rising scores over the observation period (p < 0.001) and had final scores of 25 or higher. In contrast, the maximum score obtained by any survivor at any time was 15. Regardless of survival, scores for the 23 participants generally decreased immediately following therapy escalation (p = 0.016). Further reflecting these changes in patient support, there was also a trend for scores to move from rising to falling at such treatment escalations (p = 0.024). This pilot study indicates that the ‘cardiac decompensation score’ accurately represents changes in heart function specific to an individual patient. Use of the score in conjunction with IABP may lead to earlier identification of changes in a patient's cardiac function and thus facilitate improved IABP outcomes.
Abstract:
Objective: Diarrhoea in the enterally tube fed (ETF) intensive care unit (ICU) patient is a multifactorial problem. Diarrhoeal aetiologies in this patient cohort remain debatable; however, the consequences of diarrhoea have been well established and include electrolyte imbalance, dehydration, bacterial translocation, perianal wound contamination and sleep deprivation. This study examined the incidence of diarrhoea and explored factors contributing to the development of diarrhoea in the ETF, critically ill, adult patient. ---------- Method: After institutional ethical review and approval, a single-centre medical chart audit was undertaken to examine the incidence of diarrhoea in ETF, critically ill patients. Retrospective, non-probability sequential sampling was used, covering all adult emergency-admission ICU patients who met the inclusion/exclusion criteria. ---------- Results: Fifty patients were audited. Faecal frequency, consistency and quantity were considered important criteria in defining ETF diarrhoea. The incidence of diarrhoea was 78%. Total patient diarrhoea days (r = 0.422; p = 0.02) and total diarrhoea frequency (r = 0.313; p = 0.027) increased the longer the patient was ETF. Increased severity of illness, peripheral oxygen saturation (SpO2), glucose control, albumin and white cell count were found to be statistically significant factors for the development of diarrhoea. ---------- Conclusion: Diarrhoea in ETF critically ill patients is multifactorial. The early identification of diarrhoea risk factors and the development of a diarrhoea risk management algorithm are recommended.
Abstract:
Objective: The aim of this literature review is to identify the role of probiotics in the management of enteral tube feeding (ETF) diarrhoea in critically ill patients. ---------- Background: Diarrhoea is a common gastrointestinal problem seen in ETF patients. The incidence of diarrhoea in tube-fed patients varies from 2% to 68%. Despite extensive investigation, the pathogenesis of ETF diarrhoea remains unclear, and evidence to support probiotics to manage ETF diarrhoea in critically ill patients remains sparse. ---------- Method: Literature on ETF diarrhoea and probiotics in critically ill, adult patients published from 1980 to 2010 was reviewed. The Cochrane Library, Pubmed, Science Direct, Medline and the Cumulative Index of Nursing and Allied Health Literature (CINAHL) electronic databases were searched using specific inclusion/exclusion criteria. Key search terms were: enteral nutrition, diarrhoea, critical illness, probiotics, probiotic species and randomised controlled trial (RCT). ---------- Results: Four RCT papers were identified: two full studies, one pilot RCT and one conference abstract reporting a pilot RCT. A trend towards a reduction in diarrhoea incidence was observed in the probiotic groups. However, mortality associated with probiotic use in some severely and critically ill patients must caution the clinician against its use. ---------- Conclusion: Evidence to support probiotic use in the management of ETF diarrhoea in critically ill patients remains unclear. This paper argues that probiotics should not be administered to critically ill patients until further research has examined the causal relationship between probiotics and mortality, irrespective of the patient's disease state or the projected prophylactic benefit of probiotic administration.
Abstract:
Human mesenchymal stem cells (hMSCs) possess great therapeutic potential for the treatment of bone disease and fracture non-union. Too often, however, in vitro evidence alone of the interaction between hMSCs and the biomaterial of choice is used as justification for continued development of the material into the clinic. Clearly, for hMSC-based regenerative medicine to be successful in the treatment of orthopaedic trauma, it is crucial to transplant hMSCs with a suitable carrier that facilitates their survival, optimal proliferation and osteogenic differentiation in vitro and in vivo. This motivated us to evaluate the use of polycaprolactone-20% tricalcium phosphate (PCL-TCP) scaffolds produced by fused deposition modeling for the delivery of hMSCs. When hMSCs were cultured on the PCL-TCP scaffolds and imaged by a combination of phase contrast, scanning electron and confocal laser microscopy, we observed five distinct stages of colonization over a 21-day period, characterized by cell attachment, spreading, cellular bridging, the formation of a dense cellular mass and, when induced with osteogenic stimulants, the accumulation of a mineralized extracellular matrix. Having established that PCL-TCP scaffolds are able to support hMSC proliferation and osteogenic differentiation, we next tested the in vivo efficacy of hMSC-loaded PCL-TCP scaffolds in critical-sized femoral defects in nude rats. We found that fluorescently labeled hMSCs survived in the defect site for up to 3 weeks post-transplantation. However, only 50% of the femoral defects treated with hMSCs responded favorably as determined by new bone volume. As such, we show that verification of hMSC viability and differentiation in vitro is not sufficient to predict the efficacy of transplanted stem cells in consistently promoting bone formation in orthotopic defects in vivo.
Abstract:
This study conceptualises, operationalises and validates the concept of Knowledge Management Competence as a four-phase multidimensional formative index. Employing survey data from 310 respondents representing 27 organizations using the SAP Enterprise System Financial module, the study demonstrates a large, significant, positive relationship between Knowledge Management Competence and Enterprise Systems Success (ES-success, as conceived by Gable, Sedera and Chan (2008)), suggesting important implications for practice. Strong evidence of the validity of Knowledge Management Competence as conceived and operationalised also suggests potential for future research evaluating its relationships with possible antecedents and consequences.
Abstract:
Objective--To determine whether heart failure with preserved systolic function (HFPSF) has a different natural history from left ventricular systolic dysfunction (LVSD). Design and setting--A retrospective analysis of 10 years of data (for patients admitted between 1 July 1994 and 30 June 2004, with a study census date of 30 June 2005) routinely collected as part of clinical practice in a large tertiary referral hospital. Main outcome measures--Sociodemographic characteristics, diagnostic features, comorbid conditions, pharmacotherapies, readmission rates and survival. Results--Of the 2961 patients admitted with chronic heart failure, 753 had echocardiograms available for this analysis. Of these, 189 (25%) had normal left ventricular size and systolic function. Compared with patients with LVSD, those with HFPSF were more often female (62.4% v 38.5%; P = 0.001), had less social support, were more likely to live in nursing homes (17.9% v 7.6%; P < 0.001), and had a greater prevalence of renal impairment (86.7% v 6.2%; P = 0.004), anaemia (34.3% v 6.3%; P = 0.013) and atrial fibrillation (51.3% v 47.1%; P = 0.008), but significantly less ischaemic heart disease (53.4% v 81.2%; P = 0.001). Patients with HFPSF were less likely to be prescribed an angiotensin-converting enzyme inhibitor (61.9% v 72.5%; P = 0.008); carvedilol was used more frequently in LVSD (1.5% v 8.8%; P < 0.001). Readmission rates were higher in the HFPSF group (median, 2 v 1.5 admissions; P = 0.032), particularly for malignancy (4.2% v 1.8%; P < 0.001) and anaemia (3.9% v 2.3%; P < 0.001). Both groups had the same poor survival rate (P = 0.912). Conclusions--Patients with HFPSF were predominantly older women with less social support and higher readmission rates for associated comorbid illnesses. We therefore propose that reduced survival in HFPSF may relate more to comorbid conditions than to suboptimal cardiac management.