711 results for CI


Abstract:

Background: Associations of sitting-time and physical activity (PA) with depression are unclear. Purpose: To examine concurrent and prospective associations of both sitting-time and PA with prevalent depressive symptoms in mid-aged Australian women. Methods: Data were from 8,950 women, aged 50-55 years in 2001, who completed mail surveys in 2001, 2004, 2007 and 2010. Depressive symptoms were assessed using the Center for Epidemiological Studies Depression questionnaire. Associations of sitting-time (≤4, >4-7, >7 hrs/day) and PA (none, some, meeting guidelines) with depressive symptoms (symptoms/no symptoms) were examined in 2011 using concurrent and lagged mixed-effects logistic models. Both main-effects and interaction models were developed. Results: In main-effects modeling, women who sat >7 hrs/day (OR 1.47, 95%CI 1.29-1.67) and women who did no PA (OR 1.99, 95%CI 1.75-2.27) were more likely to have depressive symptoms than women who sat ≤4 hrs/day and women who met PA guidelines, respectively. In interaction modeling, the likelihood of depressive symptoms in women who sat >7 hrs/day and did no PA was triple that of women who sat ≤4 hrs/day and met PA guidelines (OR 2.96, 95%CI 2.37-3.69). In prospective main-effects and interaction modeling, sitting-time was not associated with depressive symptoms, but women who did no PA were more likely than those who met PA guidelines to have future depressive symptoms (OR 1.26, 95%CI 1.08-1.47). Conclusions: Increasing PA to a level commensurate with PA guidelines can alleviate current depressive symptoms and prevent future symptoms in mid-aged women. Reducing sitting-time may ameliorate current symptoms.
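As a rough illustration of how odds ratios like those above are obtained, the sketch below fits a plain logistic regression with Python's statsmodels and exponentiates the coefficients. It is a minimal sketch only: the data frame, variable names and category labels are invented, and the study itself used mixed-effects models for the repeated surveys, which this simplification omits.

```python
# Minimal sketch (not the study's code): odds ratios with 95% CIs from a
# logistic regression. `df`, its columns and category labels are hypothetical;
# the study used mixed-effects logistic models for repeated measures.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "depressed": rng.binomial(1, 0.2, 500),
    "sitting":   rng.choice(["le4", "4to7", "gt7"], 500),
    "pa":        rng.choice(["none", "some", "meets"], 500),
})

# Main-effects model; an interaction model would use `sitting * pa` instead.
fit = smf.logit("depressed ~ C(sitting, Treatment('le4')) "
                "+ C(pa, Treatment('meets'))", data=df).fit(disp=0)

# Exponentiating coefficients and CI bounds gives ORs with 95% CIs.
ors = np.exp(fit.params).rename("OR")
ci = np.exp(fit.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([ors, ci], axis=1))
```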

Abstract:

Vitamin D may have anti-skin cancer effects, but population-based evidence is lacking. We therefore assessed associations between vitamin D status and skin cancer risk in an Australian subtropical community. We analyzed prospective skin cancer incidence for 11 years following baseline assessment of serum 25(OH)-vitamin D in 1,191 adults (average age 54 years) and used multivariable logistic regression analysis to adjust risk estimates for age, sex, detailed assessments of usual time spent outdoors, phenotypic characteristics, and other possible confounders. Participants with serum 25(OH)-vitamin D concentrations above 75 nmol L⁻¹ versus those below 75 nmol L⁻¹ more often developed basal cell carcinoma (odds ratio (OR)=1.51; 95% confidence interval (CI): 1.10-2.07; P=0.01) and melanoma (OR=2.71; 95% CI: 0.98-7.48; P=0.05). Squamous cell carcinoma incidence tended to be lower in persons with serum 25(OH)-vitamin D concentrations above 75 nmol L⁻¹ compared with those below (OR=0.67; 95% CI: 0.44-1.03; P=0.07). Vitamin D status was not associated with skin cancer incidence when participants were classified as above or below 50 nmol L⁻¹ 25(OH)-vitamin D. Our findings do not indicate that the carcinogenicity of high sun exposure can be counteracted by high vitamin D status. High sun exposure is to be avoided as a means to achieve high vitamin D status.
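For readers unfamiliar with the unadjusted version of these quantities, the sketch below computes an odds ratio and a Woolf (log-scale) 95% confidence interval from a 2×2 table. The counts are hypothetical; the study's estimates above were additionally adjusted via multivariable logistic regression, which this does not do.

```python
# Hedged sketch: odds ratio with a Woolf (log) 95% CI from a 2x2 table.
# Counts below are made up for illustration only.
import math

# rows: 25(OH)D > 75 nmol/L vs <= 75 nmol/L; cols: BCC vs no BCC (hypothetical)
a, b = 40, 160   # exposed: cases, non-cases
c, d = 70, 420   # unexposed: cases, non-cases

or_ = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(or_) - 1.96 * se_log_or)
hi = math.exp(math.log(or_) + 1.96 * se_log_or)
print(f"OR = {or_:.2f}, 95% CI: {lo:.2f}-{hi:.2f}")
```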

Abstract:

The health impacts of exposure to ambient temperature have been drawing increasing attention from the environmental health research community, government, society, industries, and the public. Case-crossover and time series models are most commonly used to examine the effects of ambient temperature on mortality. However, some key methodological issues remain to be addressed. For example, few studies have used spatiotemporal models to assess the effects of spatial temperatures on mortality. Few studies have used a case-crossover design to examine the delayed (distributed lag) and non-linear relationship between temperature and mortality. Also, little evidence is available on the effects of temperature changes on mortality, and on differences in heat-related mortality over time. This thesis aimed to address the following research questions: (1) How can the case-crossover design and distributed lag non-linear models be combined? (2) Is there any significant difference in effect estimates between time series and spatiotemporal models? (3) How can the effects on mortality of temperature changes between neighbouring days be assessed? (4) Does the effect of temperature on mortality change over time?

To combine the case-crossover design and the distributed lag non-linear model, datasets including deaths, weather conditions (minimum, mean, and maximum temperature, and relative humidity), and air pollution were acquired for Tianjin, China, for the years 2005 to 2007. I demonstrated how to combine the case-crossover design with a distributed lag non-linear model. This allows the case-crossover design to estimate the non-linear and delayed effects of temperature whilst controlling for seasonality. There was a consistent U-shaped relationship between temperature and mortality. Cold effects were delayed by 3 days and persisted for 10 days. Hot effects were acute, lasted for three days, and were followed by mortality displacement for non-accidental, cardiopulmonary, and cardiovascular deaths. Mean temperature was a better predictor of mortality (based on model fit) than maximum or minimum temperature.

It is still unclear whether spatiotemporal models using spatial temperature exposure produce better estimates of mortality risk than time series models that use a single site's temperature or temperature averaged over a network of sites. Daily mortality data were obtained from 163 locations across Brisbane city, Australia, from 2000 to 2004. Ordinary kriging was used to interpolate spatial temperatures across the city based on 19 monitoring sites. A spatiotemporal model was used to examine the impact of spatial temperature on mortality. A time series model was used to assess the effects of a single site's temperature, and of temperature averaged from 3 monitoring sites, on mortality. Squared Pearson scaled residuals were used to check model fit. The results show that even though the spatiotemporal model gave a better fit than the time series models, the two approaches gave similar effect estimates. Time series analyses using temperature recorded at a single monitoring site, or the average of multiple sites, were equally good at estimating the association between temperature and mortality as the spatiotemporal model.

A time series Poisson regression model was used to estimate the association between temperature change and mortality in summer in Brisbane, Australia during 1996–2004 and Los Angeles, United States during 1987–2000. Temperature change was calculated as the current day's mean temperature minus the previous day's mean. In Brisbane, a drop of more than 3 °C between days was associated with relative risks (RRs) of 1.16 (95% confidence interval (CI): 1.02, 1.31) for non-external mortality (NEM), 1.19 (95% CI: 1.00, 1.41) for NEM in females, and 1.44 (95% CI: 1.10, 1.89) for NEM at ages 65–74 years. An increase of more than 3 °C was associated with RRs of 1.35 (95% CI: 1.03, 1.77) for cardiovascular mortality and 1.67 (95% CI: 1.15, 2.43) for people aged <65 years. In Los Angeles, only a drop of more than 3 °C was significantly associated with mortality: RRs of 1.13 (95% CI: 1.05, 1.22) for total NEM, 1.25 (95% CI: 1.13, 1.39) for cardiovascular mortality, and 1.25 (95% CI: 1.14, 1.39) for people aged ≥75 years. In both cities, there were joint effects of temperature change and mean temperature on NEM. A change in temperature of more than 3 °C, whether positive or negative, has an adverse impact on mortality even after controlling for mean temperature.

I examined the variation in the effects of high temperatures on elderly mortality (age ≥75 years) by year, city and region for 83 large US cities between 1987 and 2000. High temperature days were defined as two or more consecutive days with temperatures above the 90th percentile for each city during each warm season (May 1 to September 30). The mortality risk for high temperatures was decomposed into a "main effect" due to high temperatures, estimated using a distributed lag non-linear function, and an "added effect" due to consecutive high temperature days. I pooled yearly effects across regions and overall effects at both regional and national levels. The effects of high temperature (both main and added) on elderly mortality varied greatly by year, city and region. Years with higher heat-related mortality were often followed by years with relatively lower mortality. Understanding this variability in the effects of high temperatures is important for the development of heat-warning systems.

In conclusion, this thesis makes contributions in several respects. The case-crossover design was combined with a distributed lag non-linear model to assess the effects of temperature on mortality in Tianjin, allowing the case-crossover design to flexibly estimate the non-linear and delayed effects of temperature. Both extreme cold and extreme high temperatures increased the risk of mortality in Tianjin. Time series models using a single site's temperature, or temperature averaged from several sites, can be used to examine the effects of temperature on mortality. Temperature change, whether a large drop or a large increase, increases the risk of mortality. The effect of high temperature on mortality is highly variable from year to year.
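A minimal sketch of the temperature-change exposure described above: the change is the current day's mean temperature minus the previous day's, and daily death counts are regressed on it with a Poisson model while controlling for mean temperature. The data frame and variable names are invented, and the real analyses also adjusted for seasonality and other covariates, which this omits.

```python
# Hedged sketch of the temperature-change analysis: day-to-day change is
# today's mean temperature minus yesterday's, then daily deaths are
# regressed on it with Poisson regression. `df` and its columns are
# hypothetical stand-ins.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({"mean_temp": rng.normal(25, 3, 365),
                   "deaths": rng.poisson(20, 365)})

# Temperature change between neighbouring days (first day has no lag).
df["temp_change"] = df["mean_temp"].diff()

# Poisson regression of deaths on temperature change, controlling for
# mean temperature, as in the joint-effects analysis described above.
fit = smf.poisson("deaths ~ temp_change + mean_temp", data=df.dropna()).fit(disp=0)
print(np.exp(fit.params))  # rate ratios per 1 °C
```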

Abstract:

Monitoring foodservice satisfaction is a risk management strategy for malnutrition in the acute care sector, as low satisfaction may be associated with poor intake. This study aimed to investigate the relationship between age and foodservice satisfaction in the private acute care setting. Patient satisfaction was assessed using a validated tool, the Acute Care Hospital Foodservice Patient Satisfaction Questionnaire, for data collected from 2008 to 2010 (n = 779) at a private hospital in Brisbane. Age was grouped into three categories: ≤50 years, 51–70 years and >70 years. Fisher's exact test assessed independence of categorical responses and age group; ANOVA or the Kruskal–Wallis test was used for continuous variables. Dichotomised responses were analysed using logistic regression and odds ratios (95% confidence interval, p < 0.05). Overall foodservice satisfaction (5-point scale) was high (≥4 out of 5) and was independent of age group (p = 0.377). There was an increasing trend with age in mean satisfaction scores for individual dimensions of foodservice: food quality (p < 0.001), meal service quality (p < 0.001), staff service issues (p < 0.001) and physical environment (p < 0.001). A preference for being able to choose different sized meals (59.8% >70 years vs 40.6% ≤50 years; p < 0.001) and response to 'the foods are just the right temperature' (55.3% >70 years vs 35.9% ≤50 years; p < 0.001) were dependent on age. For the food quality dimension, based on dichotomised responses (satisfied or not), the odds of satisfaction were higher for >70 years (OR = 5.0, 95% CI: 1.8–13.8; ≤50 years referent). These results suggest that dimensions of foodservice satisfaction are associated with age and can assist foodservices to meet varying generational expectations of clients.
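The dichotomised comparisons above are the kind of analysis Fisher's exact test handles. A minimal sketch with hypothetical counts (not the study's data):

```python
# Illustrative sketch: Fisher's exact test of whether a dichotomised
# satisfaction response is independent of age group, plus the odds ratio.
# All counts are hypothetical.
from scipy.stats import fisher_exact

#            satisfied  not satisfied
table = [[120, 15],    # >70 years (hypothetical)
         [ 90, 45]]    # <=50 years (hypothetical)

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.1f}, p = {p_value:.4f}")
```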

Abstract:

Objective: Several new types of contraception became available in Australia over the last twelve years (the implant in 2001, the progestogen intra-uterine device (IUD) in 2003, and the vaginal contraceptive ring in 2007). Most methods of contraception require access to health services. Permanent sterilisation and the insertion of an implant or IUD involve a surgical procedure. Access to health professionals providing these specialised services may be more difficult in rural areas. This paper examines uptake of permanent or long-acting reversible contraception (LARC) among Australian women in rural areas compared to women in urban areas. Method: Participants in the Australian Longitudinal Study on Women's Health born in 1973-78 reported on their contraceptive use at three surveys: 2003, 2006 and 2009. Contraceptive methods included permanent sterilisation (tubal ligation, vasectomy), non-daily or LARC methods (implant, IUD, injection, vaginal ring), and other methods including daily, barrier or "natural" methods (oral contraceptive pills, condoms, withdrawal, safe period). Sociodemographic, reproductive history and health service use factors associated with using permanent, LARC or other methods were examined using multivariable logistic regression analysis. Results: Of 9,081 women aged 25-30 in 2003, 3% used permanent methods and 4% used LARCs. Six years later, in 2009, of 8,200 women (aged 31-36), 11% used permanent methods and 9% used LARCs. The fully adjusted parsimonious regression model showed that the likelihood of a woman using LARCs or permanent methods increased with the number of children. Women whose youngest child was school-age were more likely to use LARCs (OR=1.83, 95%CI 1.43-2.33) or permanent methods (OR=4.39, 95%CI 3.54-5.46) than women with pre-school children. Compared to women living in major cities, women in inner regional areas were more likely to use LARCs (OR=1.26, 95%CI 1.03-1.55) or permanent methods (OR=1.43, 95%CI 1.17-1.76). Women living in outer regional and remote areas were more likely than women living in cities to use LARCs (OR=1.65, 95%CI 1.31-2.08) or permanent methods (OR=1.69, 95%CI 1.43-2.14). Women with poorer access to GPs were more likely to use permanent methods (OR=1.27, 95%CI 1.07-1.52). Conclusions: Location of residence and access to health services are important factors in women's choices about long-acting contraception, in addition to the number and age of their children. There is a low level of uptake of non-daily, long-acting methods of contraception among Australian women in their mid-thirties.

Abstract:

Objective: To estimate the time spent by researchers preparing grant proposals, and to examine whether spending more time increases the chances of success. Design: Observational study. Setting: The National Health and Medical Research Council (NHMRC) of Australia. Participants: Researchers who submitted one or more NHMRC Project Grant proposals in March 2012. Main outcome measures: Total researcher time spent preparing proposals; funding success as predicted by the time spent. Results: The NHMRC received 3727 proposals, of which 3570 were reviewed and 731 (21%) were funded. Among our 285 participants, who submitted 632 proposals, 21% were successful. Preparing a new proposal took an average of 38 working days of researcher time and a resubmitted proposal took 28 working days, an overall average of 34 days per proposal. An estimated 550 working years of researchers' time (95% CI 513 to 589) was spent preparing the 3727 proposals, which translates into annual salary costs of AU$66 million. More time spent preparing a proposal did not increase the chances of success for the lead researcher (prevalence ratio (PR) of success for a 10-day increase=0.91, 95% credible interval 0.78 to 1.04) or other researchers (PR=0.89, 95% CI 0.67 to 1.17). Conclusions: Considerable time is spent preparing NHMRC Project Grant proposals. As success rates are historically 20–25%, much of this time has no immediate benefit to either the researcher or society, and there are large opportunity costs in lost research output. The application process could be shortened so that only information relevant for peer review, not administration, is collected. This would have little impact on the quality of peer review, and the time saved could be reinvested into research.
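The headline figure of roughly 550 working years can be sanity-checked with simple arithmetic. The 230 working days per year used below is my assumption, not a number stated in the abstract:

```python
# Back-of-envelope check of the figures above (230 working days per year
# is an assumption of this sketch, not a value from the study).
proposals = 3727
avg_days_per_proposal = 34       # reported overall average
working_days_per_year = 230      # assumption

total_days = proposals * avg_days_per_proposal
years = total_days / working_days_per_year
print(f"{total_days} working days ≈ {years:.0f} working years")   # ≈ 551

# Implied average annual salary if 550 years cost AU$66 million:
print(f"implied salary ≈ AU${66e6 / 550:,.0f} per year")           # ≈ AU$120,000
```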

Abstract:

Objective: To determine the frequency and nature of intern underperformance as documented on in-training assessment forms. Methods: A retrospective review of intern assessment forms from a 2-year period (2009–2010) was conducted at a tertiary referral hospital in Brisbane, Queensland. The frequency of interns assessed as 'requiring substantial assistance' and/or 'requires further development' on mid- or end-of-term assessment forms was determined. Forms were analysed by the clinical rotation, time of year and domain(s) of clinical practice in which underperformance was documented. Results: During 2009 and 2010 the overall documented incidence of intern underperformance was 2.4% (95% CI 1.5–3.9%). The emergency medicine rotation detected significantly more underperformance than other rotations (P < 0.01). Interns predominantly had difficulty with 'clinical judgment and decision-making skills', 'time management skills' and 'teamwork and colleagues' (62.5%, 55% and 32.5% of underperforming assessments, respectively). Time of year did not affect the frequency of underperformance. Overall, 13.4% (95% CI 9.2–19.0%) of interns working at the institution over the study period received at least one assessment in which underperformance was documented. Seventy-six per cent of interns who had underperformance identified at mid-term assessment successfully completed the term following remediation. Conclusion: The prevalence of underperformance among interns is low, although higher than previously suggested. Emergency medicine detects relatively more interns in difficulty than other rotations.
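Confidence intervals for proportions such as the 2.4% incidence above can be computed several ways; the sketch below uses a Wilson score interval with hypothetical counts (the abstract reports percentages, not raw numerators and denominators):

```python
# Hedged sketch: 95% Wilson score interval for a proportion like the 2.4%
# underperformance incidence above. Counts are hypothetical.
from statsmodels.stats.proportion import proportion_confint

events, n = 12, 500  # hypothetical: 12 underperforming assessments of 500
lo, hi = proportion_confint(events, n, alpha=0.05, method="wilson")
print(f"{events/n:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```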

Abstract:

Background: The Vulnerable Elders Survey-13 (VES-13) is increasingly used to screen for older patients who can proceed to intensive chemotherapy without further comprehensive assessment. This study compared the VES-13 determination of fitness for treatment with oncologists' assessments of fitness. Method: Sample: a consecutive series of solid tumour patients ≥65 years (n=175; mean age=72; range=65-86) from an Australian cancer centre. Patients were screened with the VES-13 before proceeding to usual treatment. Blinded to screening, oncologists concurrently predicted patient fitness for chemotherapy. A sample of 175 can detect, with 90% power, kappa coefficients of agreement between VES-13 and oncologists' assessments >0.90 ("almost perfect agreement"). Separate backward stepwise logistic regression analyses assessed potential predictors of VES-13 and oncologists' ratings of fitness. Results: The kappa coefficient for agreement between VES-13 and oncologists' ratings of fitness was 0.41 (p<0.001). VES-13 and oncologists' assessments agreed in 71% of ratings. VES-13 sensitivity = 83.3%; specificity = 57%; positive predictive value = 69%; negative predictive value = 75%. Logistic regression modelling indicated that the odds of being vulnerable to chemotherapy (VES-13) increased with increasing depression (OR=1.42; 95% CI: 1.18, 1.71) and decreased with increased functional independence assessed on the Barthel Index (OR=0.82; CI: 0.74, 0.92) and Lawton instrumental activities of daily living (OR=0.44; CI: 0.30, 0.65); R²=0.65. Similarly, the odds of a patient being vulnerable to chemotherapy, when assessed by physicians, increased with increasing age (OR=1.15; CI: 1.07, 1.23) and depression (OR=1.23; CI: 1.06, 1.43), and decreased with increasing functional independence (OR=0.91; CI: 0.85, 0.98); R²=0.32. Conclusions: Our data indicate moderate agreement between VES-13 and clinician assessments of patients' fitness for chemotherapy. Current 'one-step' screening processes to determine fitness have limits. Nonetheless, screening tools have the potential for modification and enhanced predictive properties in cancer care by adding relevant items, thus enabling fit patients to be immediately referred for chemotherapy.
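For readers wanting the mechanics behind the agreement statistics reported above, the sketch below computes Cohen's kappa, sensitivity and specificity from a 2×2 screen-versus-clinician table. The counts are invented, not the study's data:

```python
# Illustrative sketch (hypothetical 2x2 agreement table): Cohen's kappa plus
# sensitivity/specificity of a screen against a clinician reference.
import numpy as np

#                 clinician: unfit   fit
table = np.array([[50, 22],   # VES-13: vulnerable
                  [10, 93]])  # VES-13: not vulnerable

n = table.sum()
po = np.trace(table) / n                          # observed agreement
pe = (table.sum(1) * table.sum(0)).sum() / n**2   # chance agreement
kappa = (po - pe) / (1 - pe)

sens = table[0, 0] / table[:, 0].sum()  # screen positive among truly unfit
spec = table[1, 1] / table[:, 1].sum()  # screen negative among truly fit
print(f"kappa = {kappa:.2f}, sensitivity = {sens:.1%}, specificity = {spec:.1%}")
```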

Abstract:

Background: Surveillance programs and research for acute respiratory infections in remote Australian communities are complicated by difficulties in the storage and transport of frozen samples to urban laboratories for testing. This study assessed the sensitivity of a simple method for transporting nasal swabs from a remote setting for bacterial polymerase chain reaction (PCR) testing. Methods: We sampled every individual who presented to a remote community clinic over a three-week period in August, at a time of low influenza and no respiratory syncytial virus activity. Two anterior nasal swabs were collected from each participant. The left nare specimen was mailed to the laboratory via routine postal services. The right nare specimen was transported frozen. Testing for six bacterial species was undertaken using real-time PCR. Results: One hundred and forty participants were enrolled, contributing 150 study visits and paired specimens for testing. Respiratory illnesses accounted for 10% of the reasons for presentation. Bacteria were identified in 117 (78%) presentations for 110 (79.4%) individuals; Streptococcus pneumoniae and Haemophilus influenzae were the most common (each identified in 58% of episodes). The overall sensitivity for any bacterium detected in mailed specimens was 82.2% (95% CI 73.6, 88.1) compared to 94.8% (95% CI 89.4, 98.1) for frozen specimens. The sensitivity of the two methods varied by species identified. Conclusion: The mailing of unfrozen nasal specimens from remote communities appears to influence the utility of the specimen for bacterial studies, with a loss in sensitivity for the detection of any species overall. Further studies are needed to confirm our findings and to investigate the possible mechanisms of effect. Clinical trial registration: Australia and New Zealand Clinical Trials Registry Number ACTRN12609001006235. Keywords: Respiratory bacteria; RT-PCR; Specimen transport; Laboratory methods
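With paired specimens from the same participants, a natural way to compare detection between the two transport methods is McNemar's test on the discordant pairs. A minimal sketch with hypothetical counts (the abstract reports sensitivities rather than a full paired table):

```python
# Hedged sketch: McNemar's test comparing detection in paired mailed vs
# frozen specimens. The concordant/discordant counts are hypothetical.
from statsmodels.stats.contingency_tables import mcnemar

#                 frozen +   frozen -
table = [[95,  2],   # mailed +
         [16, 37]]   # mailed -

result = mcnemar(table, exact=True)  # exact binomial test on discordant pairs
print(f"p = {result.pvalue:.4f}")
```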

Abstract:

BACKGROUND: Epidemiologic research has demonstrated that cutaneous markers of photo-damage are associated with risk of basal cell carcinoma (BCC). However, there has been no previous attempt to calculate pooled risk estimates. METHODS: We conducted a systematic review and meta-analysis after extracting relevant studies published up to January 2013 from five electronic databases. Eligible studies were those that permitted quantitative assessment of the association between histologically-confirmed BCC and actinic keratoses, solar elastosis, solar lentigines, or telangiectasia. RESULTS: Seven eligible studies were identified and summary odds ratios (ORs) were calculated using both random and quality effects models. Having more than ten actinic keratoses was most strongly associated with BCC, conferring up to a 5-fold increase in risk (OR: 4.97; 95% CI: 3.26, 7.58). Other factors, including solar elastosis, solar lentigines, and telangiectasia, had weaker but positive associations with BCC, with ORs around 1.5. CONCLUSIONS: Markers of chronic photo-damage are positively associated with BCC. Of the markers examined, actinic keratoses were the most strongly associated with BCC. IMPACT: This work highlights the relatively modest association between markers of chronic ultraviolet exposure and BCC.
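A hedged sketch of the random-effects pooling referred to above, using the DerSimonian-Laird estimator on log odds ratios. The per-study ORs and CI bounds below are invented stand-ins, not the seven included studies:

```python
# Illustrative sketch: DerSimonian-Laird random-effects pooling of log ORs.
# Study-level inputs are hypothetical.
import numpy as np

ors = np.array([3.8, 5.6, 4.2])      # hypothetical study ORs
ci_hi = np.array([7.0, 11.0, 9.5])   # hypothetical upper 95% CI bounds

y = np.log(ors)                       # log ORs
se = (np.log(ci_hi) - y) / 1.96       # SEs recovered from CI width
w = 1 / se**2                         # fixed-effect (inverse-variance) weights

# DerSimonian-Laird between-study variance tau^2
q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)
tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1 / (se**2 + tau2)             # random-effects weights
pooled = np.sum(w_re * y) / np.sum(w_re)
se_pooled = np.sqrt(1 / np.sum(w_re))
print(f"pooled OR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96*se_pooled):.2f}-"
      f"{np.exp(pooled + 1.96*se_pooled):.2f})")
```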

Abstract:

Remote monitoring for heart failure has been evaluated in numerous systematic reviews. The aim of this meta-review was to appraise their quality and synthesise their results. We electronically searched online databases, performed a forward citation search and hand-searched bibliographies. Systematic reviews of remote monitoring interventions used for surveillance of heart failure patients were included. Seven (41%) of the systematic reviews pooled results for meta-analysis. Eight (47%) considered all non-invasive remote monitoring strategies, five (29%) focused on telemonitoring, and four (24%) included both non-invasive and invasive technologies. According to AMSTAR criteria, ten (58%) systematic reviews were of poor methodological quality. In high-quality reviews, the relative risk of mortality in patients who received remote monitoring ranged from 0.53 (95% CI 0.29-0.96) to 0.88 (95% CI 0.76-1.01). High-quality reviews also reported that remote monitoring reduced the relative risk of all-cause hospitalizations (RRs ranging from 0.52, 95% CI 0.28-0.96, to 0.96, 95% CI 0.90-1.03) and heart failure-related hospitalizations (RRs from 0.72, 95% CI 0.64-0.81, to 0.79, 95% CI 0.67-0.94) and, as a consequence, healthcare costs. As the high-quality reviews reported that remote monitoring reduced hospitalizations, mortality and healthcare costs, research efforts should now be directed towards optimising these interventions in preparation for more widespread implementation.

Abstract:

Background: Side effects of the medications used for procedural sedation and analgesia in the cardiac catheterisation laboratory are known to cause impaired respiratory function. Impaired respiratory function poses considerable risk to patient safety as it can lead to inadequate oxygenation. Knowing which pre-procedure conditions predict impaired respiratory function would enable nurses to identify at-risk patients and selectively implement intensive respiratory monitoring, reducing the possibility of inadequate oxygenation. Aim: To identify pre-procedure risk factors for impaired respiratory function during nurse-administered procedural sedation and analgesia in the cardiac catheterisation laboratory. Design: Retrospective matched case–control study. Methods: 21 cases of impaired respiratory function were identified and matched to 113 controls from a consecutive cohort of patients aged over 18 years. Conditional logistic regression was used to identify risk factors for impaired respiratory function. Results: With each additional indicator of acute illness, patients were nearly twice as likely as their matched controls to experience impaired respiratory function (OR 1.78; 95% CI 1.19–2.67; p = 0.005). Indicators of acute illness included emergency admission, being transferred from a critical care unit for the procedure, or requiring respiratory or haemodynamic support in the lead-up to the procedure. Conclusion: Several factors that predict the likelihood of impaired respiratory function were identified. The results from this study could be used to inform prospective studies investigating the effectiveness of interventions for impaired respiratory function during nurse-administered procedural sedation and analgesia in the cardiac catheterisation laboratory.
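Matched case-control data like these are typically analysed with conditional logistic regression, which conditions on the matched sets. A minimal sketch with an invented data frame (`event`, `acuity` and `match_id` are hypothetical names, not the study's variables):

```python
# Hedged sketch: conditional logistic regression for a matched case-control
# design. `event` marks impaired respiratory function, `acuity` counts
# acute-illness indicators, and `match_id` identifies each matched set.
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "match_id": np.repeat(np.arange(21), 6),      # 21 matched sets
    "event":    np.tile([1, 0, 0, 0, 0, 0], 21),  # 1 case : 5 controls
    "acuity":   rng.integers(0, 4, 21 * 6),
})

fit = ConditionalLogit(df["event"], df[["acuity"]], groups=df["match_id"]).fit()
print(np.exp(fit.params))  # OR per additional indicator of acute illness
```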

Abstract:

Background: Procedural sedation and analgesia (PSA) administered by nurses in the cardiac catheterisation laboratory (CCL) is unlikely to yield serious complications. However, the safety of this practice depends on timely identification and treatment of depressed respiratory function. Aim: To describe respiratory monitoring in the CCL. Methods: Retrospective medical record audit of adult patients who underwent a procedure in the CCLs of one private hospital in Brisbane during May and June 2010. An electronic database was used to identify subjects and an audit tool ensured data collection was standardised. Results: Nurses administered PSA during 172/473 (37%) procedures, including coronary angiographies, percutaneous coronary interventions, electrophysiology studies, radiofrequency ablations, cardiac pacemakers, implantable cardioverter defibrillators, temporary pacing leads and peripheral vascular interventions. Oxygen saturations were recorded during 160/172 (93%) procedures, respiration rate was recorded during 17/172 (10%) procedures, use of oxygen supplementation was recorded during 40/172 (23%) procedures, and 13/172 (7.5%; 95% CI=3.59–11.41%) patients experienced oxygen desaturation. Conclusion: Although oxygen saturation was routinely documented, nurses did not regularly record respiration observations. It is likely that surgical draping and the requirement to minimise radiation exposure interfered with nurses' ability to observe respiration. Capnography could overcome these barriers to respiration assessment: its accurate measurement of exhaled carbon dioxide, coupled with an easily interpretable waveform that displays a breath-by-breath account of ventilation, enables identification of respiratory depression in real time. The results of this audit emphasise the need to ascertain the clinical benefits of using capnography to assess ventilation during PSA in the CCL.

Abstract:

The cardiac catheterisation laboratory (CCL) is a specialised medical radiology facility where both chronic-stable and life-threatening cardiovascular illnesses are evaluated and treated. Although there are many potential sources of discomfort and distress associated with procedures performed in the CCL, a general anaesthetic is not usually required, so an anaesthetist is not routinely assigned to the CCL. Instead, to manage pain, discomfort and anxiety during the procedure, nurses administer a combination of sedative and analgesic medications as directed by the cardiologist performing the procedure. This practice is referred to as nurse-administered procedural sedation and analgesia (PSA). While anecdotal evidence suggested that nurse-administered PSA was commonly used in the CCL, it was clear from the limited information available that current nurse-led PSA administration and monitoring practices varied, and that there was contention around some aspects of practice, including the types of medications suitable for use and the depth of sedation that could be safely induced without an anaesthetist present. The overall aim of the program of research presented in this thesis was to establish an evidence base for nurse-led sedation practices in the CCL context. A sequential mixed methods design was used over three phases.

The objective of the first phase was to appraise the existing evidence for nurse-administered PSA in the CCL. Two studies were conducted. The first was an integrative review of empirical research studies and clinical practice guidelines focused on nurse-administered PSA in the CCL and in other similar procedural settings. This was the first review to systematically appraise the available evidence supporting the use of nurse-administered PSA in the CCL. A major finding was that, overall, nurse-administered PSA in the CCL was generally deemed to be safe. However, the analysis of the included studies and guidelines showed that the management of sedation in the CCL was affected by a variety of contextual factors, including local hospital policy, workforce constraints and cardiologists' preferences for the type of sedation used. The second study in the first phase was conducted to identify a sedation scale that could be used to monitor level of sedation during nurse-administered PSA in the CCL. It involved a structured literature review and psychometric analysis of scale properties. Only one scale was found that was developed specifically for the CCL, and it had not undergone psychometric testing; several weaknesses were identified in its item structure. The other sedation scales identified were developed for the ICU, and although they have demonstrated validity and reliability in that setting, weaknesses in their item structure precluded their use in the CCL. As the findings indicated that no existing sedation scale should be applied to practice in the CCL, recommendations for the development and psychometric testing of a new sedation scale were made.

The objective of the second phase was to explore current practice. Three studies were conducted in this phase using both quantitative and qualitative research methods. The first was a qualitative explorative study of nurses' perceptions of the issues and challenges associated with nurse-administered PSA in the CCL. Major themes emerged from analysis of the qualitative data regarding the lack of access to anaesthetists, the limitations of sedative medications, the barriers to effective patient monitoring, and the impact of the increasing complexity of procedures on patients' sedation requirements. The second study in Phase Two was a cross-sectional survey of nurse-administered PSA practice in Australian and New Zealand CCLs. This was the first study to quantify how frequently nurse-administered PSA is used in the CCL setting and to characterise associated nursing practices. It found that nearly all CCLs (94%) utilise nurse-administered PSA. By characterising nurse-administered PSA in Australian and New Zealand CCLs, several strategies to improve practice were identified, such as setting up protocols for patient monitoring and establishing comprehensive PSA education for CCL nurses. The third study in Phase Two was a matched case-control study of risk factors for impaired respiratory function during nurse-administered PSA in the CCL setting. Patients with acute illness were found to be nearly twice as likely to experience impaired respiratory function during nurse-administered PSA (OR=1.78; 95%CI=1.19-2.67; p=0.005). These findings can now be used to inform prospective studies investigating the effectiveness of interventions for impaired respiratory function during nurse-administered PSA in the CCL.

The objective of the third and final phase was to develop recommendations for practice. A synthesis of findings from the previous phases informed a modified Delphi study, which was conducted to develop a set of clinical practice guidelines for nurse-administered PSA in the CCL. The guidelines set current best-practice standards for pre-procedural patient assessment and risk screening, as well as the intra- and post-procedural patient monitoring that nurses who administer PSA in the CCL should undertake in order to deliver safe, evidence-based and consistent care to the many patients who undergo procedures in this setting.

In summary, the mixed methods approach enabled the research objectives to be comprehensively addressed in an informed, sequential manner, and this thesis has generated a substantial amount of new knowledge to inform and support nurse-led sedation practice in the CCL context. A limitation to note is that the comprehensive appraisal of the evidence, combined with the guideline development process, highlighted numerous deficiencies in the evidence base. As such, rather than being based on high-level evidence, many of the recommendations for practice were produced by consensus. Further research is therefore required to ascertain which specific practices result in the most optimal patient and health service outcomes. Along with the necessary guideline implementation and evaluation projects, post-doctoral research is planned to follow up on the research gaps identified, as part of a continuing program of research in this field.

Abstract:

Despite the increasing number of immigrants, there is a limited body of literature describing the use of hospital emergency department (ED) care by immigrants in Australia. This study aims to describe how immigrants from refugee source countries (IRSC) utilise ED care compared to immigrants from the main English-speaking countries (MESC), immigrants from other countries (IOC) and the local population in Queensland. A retrospective analysis of a Queensland state-wide hospital ED dataset (ED Information System) from 1 January 2008 to 31 December 2010 was conducted. Our study showed that immigrants are not a homogeneous group. We found that IRSC are more likely to use interpreters in the ED (8.9%) compared to IOC. Furthermore, IRSC have a higher rate of ambulance use (odds ratio 1.2, 95% confidence interval (CI) 1.2–1.3), are less likely to be admitted to hospital from the ED (odds ratio 0.7, 95% CI 0.7–0.8), and have a longer length of stay in the ED (LOS; mean difference 33.0 minutes, 95% CI 28.8–37.2) compared to the Australian-born population. Our findings highlight the need to develop policies and educational interventions to ensure the equitable use of health services among vulnerable immigrant populations.