766 results for patient health questionnaire
Abstract:
Background: De-institutionalization of psychiatric patients has led to a greater emphasis on family management in the community, and family members are often overwhelmed by the demands that caring for a patient with schizophrenia involves. Most studies of family burden in schizophrenia have taken place in developed countries. The current study examined family burden and its correlates in a regional area of a middle-income country in South America. Method: Sixty-five relatives of patients with schizophrenia who were attending a public mental health out-patient service in the province of Arica, Chile, were assessed on Spanish versions of the Zarit Caregiver Burden Scale and the SF-36 Health Survey (SF-36). Results: Average levels of burden were very high, particularly for mothers, carers with less education, carers of younger patients and carers of patients with more hospitalisations in the previous 3 years. Kinship and number of recent hospitalisations retained unique predictive variance in a multiple regression. Burden was the strongest predictor of SF-36 subscales, and the prediction from burden remained significant after entry of other potential predictors. Conclusions: In common with families in developed countries, family members of schizophrenia patients in regional Chile reported high levels of burden and related functional and health impact. The study highlighted the support needs of carers in contexts with high rates of poverty and limited health and community resources.
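To make the analytic approach concrete, here is a minimal sketch (not the authors' code) of a multiple regression of caregiver burden on its correlates, followed by a check of whether burden still predicts an SF-36 subscale after the other predictors enter. The file name and column names (zarit, kinship_mother, carer_education, patient_age, hospitalisations_3yr, sf36_mental) are assumptions for illustration.

```python
# Hypothetical sketch of the kind of regression analysis described above;
# all column names are assumptions, not the authors' actual variables.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("caregivers.csv")  # hypothetical file: one row per relative

# Which caregiver/patient characteristics retain unique variance in burden?
burden_model = smf.ols(
    "zarit ~ kinship_mother + carer_education + patient_age + hospitalisations_3yr",
    data=df,
).fit()
print(burden_model.summary())

# Does burden still predict an SF-36 subscale after the other predictors enter?
base = smf.ols("sf36_mental ~ kinship_mother + carer_education + patient_age"
               " + hospitalisations_3yr", data=df).fit()
full = smf.ols("sf36_mental ~ kinship_mother + carer_education + patient_age"
               " + hospitalisations_3yr + zarit", data=df).fit()
print("R-squared increase from adding burden:", full.rsquared - base.rsquared)
```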
Abstract:
Introduction: Five-year survival from breast cancer in Australia is 87%. Hence, ensuring a good quality of life (QOL) has become a focal point of cancer research and clinical interest. Exercise during and after treatment has been identified as a potential strategy to optimise the QOL of women diagnosed with breast cancer. Methods: Exercise for Health is a randomised controlled trial of an eight-month exercise intervention delivered by exercise physiologists. An objective of this study was to assess the impact of the exercise program, during and following treatment, on QOL. Queensland women diagnosed with unilateral breast cancer in 2006/07 were eligible to participate. Those living in urban Brisbane (n=194) were allocated to the face-to-face exercise group, the telephone exercise group, or the usual-care group, and those living in rural Queensland (n=143) were allocated to the telephone exercise group or the usual-care group. QOL, as assessed by the Functional Assessment of Cancer Therapy-Breast (FACT-B+4) questionnaire, was measured at 4-6 weeks (pre-intervention), 6 months (mid-intervention) and 12 months (three months post-intervention) post-surgery. Results: Significant (P<0.01) increases in QOL were observed between pre-intervention and three months post-intervention (12 months post-surgery) for all women. Women in the exercise groups experienced greater mean positive changes in QOL during this time (+10 points) than the usual-care groups (+5 to +7 points) after adjusting for baseline QOL. Although all groups experienced an overall increase in QOL, approximately 20% of urban and rural women in the usual-care groups reported a decline in QOL, compared with 10% of women in the exercise groups. Conclusions: This work highlights the potential importance of participating in physical activity to optimise QOL following a diagnosis of breast cancer. Results suggest that the telephone may be an effective medium for delivering exercise counselling to newly diagnosed breast cancer patients.
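As an illustration of the kind of baseline-adjusted group comparison described, the sketch below models 12-month FACT-B+4 scores on group, adjusting for the pre-intervention score, and tabulates the proportion of women whose QOL declined. The data file and column names (qol_baseline, qol_12m, group) are hypothetical, not the trial's actual variables.

```python
# A minimal ANCOVA-style sketch of the group comparison described above,
# assuming hypothetical columns 'qol_baseline', 'qol_12m' (FACT-B+4 totals)
# and 'group' in {'face_to_face', 'telephone', 'usual_care'}.
import pandas as pd
import statsmodels.formula.api as smf

trial = pd.read_csv("exercise_for_health.csv")  # hypothetical data file

# Change in QOL by group, adjusting for baseline (pre-intervention) QOL
model = smf.ols("qol_12m ~ C(group, Treatment('usual_care')) + qol_baseline",
                data=trial).fit()
print(model.summary())

# Proportion of women in each group whose QOL declined over the study period
trial["declined"] = trial["qol_12m"] < trial["qol_baseline"]
print(trial.groupby("group")["declined"].mean())
```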
Abstract:
Research on outcomes from psychiatric disorders has highlighted the importance of expressed emotion (EE), but its cost-effective measurement remains a challenge. This article describes the development of the Family Attitude Scale (FAS), a 30-item instrument that can be completed by any informant. Its psychometric characteristics are reported in parents of undergraduate students and in 70 families with a schizophrenic member. The total FAS had high internal consistency in all samples, and reports of angry behaviour in FAS items showed acceptable inter-rater agreement. The FAS was associated with the reported anger, anger expression and anxiety of respondents. Substantial associations between the parents' FAS and the anger and anger expression of students were also observed. Parents of schizophrenic patients had higher FAS scores than parents of students, and the FAS was higher if disorder duration was longer or patient functioning was poorer. Hostility, high criticism and low warmth on the Camberwell Family Interview (CFI) were associated with a more negative FAS. The highest FAS in the family was a good predictor of a highly critical environment on the CFI. The FAS is a reliable and valid indicator of relationship stress and expressed anger that has wide applicability.
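Internal consistency of the sort reported for the FAS is usually summarised with Cronbach's alpha; the sketch below shows one way to compute it for a 30-item instrument. The data file and item columns (fas_1 ... fas_30) are assumptions for illustration.

```python
# A sketch of how internal consistency of a 30-item scale such as the FAS
# could be checked; the item column names are hypothetical.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a DataFrame with one column per scale item."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

responses = pd.read_csv("fas_items.csv")  # hypothetical: rows = informants
fas_items = responses[[f"fas_{i}" for i in range(1, 31)]]
print("Cronbach's alpha:", round(cronbach_alpha(fas_items), 3))

# Total FAS per informant; the highest score within a family could then be
# compared with Camberwell Family Interview ratings, as described above.
responses["fas_total"] = fas_items.sum(axis=1)
```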
What determines the health-related quality of life among regional and rural breast cancer survivors?
Abstract:
Objective: To assess the health-related quality of life (HRQoL) of regional and rural breast cancer survivors at 12 months post-diagnosis and to identify correlates of HRQoL. Methods: 323 (202 regional and 121 rural) Queensland women diagnosed with unilateral breast cancer in 2006/2007 participated in a population-based, cross-sectional study. HRQoL was measured using the Functional Assessment of Cancer Therapy, Breast plus arm morbidity (FACT-B+4) self-administered questionnaire. Results: In age-adjusted analyses, mean HRQoL scores of regional breast cancer survivors were comparable to those of their rural counterparts at 12 months post-diagnosis (122.9, 95% CI: 119.8, 126.0 vs. 123.7, 95% CI: 119.7, 127.8; p>0.05). Irrespective of residence, younger (<50 years) women reported lower HRQoL than older (50+ years) women (113.5, 95% CI: 109.3, 117.8 vs. 128.2, 95% CI: 125.1, 131.2; p<0.05). Women who received chemotherapy, reported two complications post-surgery, had poorer upper-body function than most, reported more stress, had reduced coping, were socially isolated, had no confidante for social-emotional support, had unmet healthcare needs, or had low health self-efficacy reported lower HRQoL scores. Together, these factors explained 66% of the variance in overall HRQoL. The pattern of results remained similar for younger and older age groups. Conclusions and Implications: The results underscore the importance of supporting and promoting regional and rural breast cancer programs that are designed to improve physical functioning, reduce stress and provide psychosocial support following diagnosis. Further, this information can be used by general practitioners and other allied health professionals to identify women at risk of poorer HRQoL.
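For readers interested in how such correlates are typically modelled, the sketch below runs an age-adjusted comparison by residence and a multivariable model whose R-squared corresponds to the "variance explained" figure reported above. The file and all variable names are hypothetical, not the study's dataset.

```python
# Hypothetical sketch of an age-adjusted comparison and a multivariable model
# like those described above; all column names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

women = pd.read_csv("hrqol_12m.csv")  # hypothetical: one row per survivor

# Age-adjusted comparison of FACT-B+4 totals by residence (regional vs rural)
adj = smf.ols("factb4_total ~ residence + age", data=women).fit()
print(adj.params, adj.conf_int(), sep="\n")

# Multivariable model: how much variance do treatment, complications,
# upper-body function, stress, coping, social support and self-efficacy explain?
full = smf.ols(
    "factb4_total ~ age_group + chemotherapy + n_complications"
    " + upper_body_function + perceived_stress + coping + socially_isolated"
    " + has_confidante + unmet_needs + health_self_efficacy",
    data=women,
).fit()
print("Variance explained (R squared):", round(full.rsquared, 2))
```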
Abstract:
Objective: To provide a systematic review of papers comparing the effectiveness of different strategies to recruit older adults (aged 50 years and over) to participate in health research studies, to guide successful recruitment in future research. Methods: Four major databases were searched for papers published between 1995 and 2008 with: target group aged 50 years or over; participants allocated to receive one of two or more recruitment strategies; and an outcome measure of response rate or enrolment in study. Results: Twelve papers were included in the review. Conclusion: For postal questionnaires, recruitment strategies used with older adults had comparable outcomes to those used to recruit from the general population. For other types of studies, strategies involving face-to-face contact may be more effective than indirect methods, but this needs to be balanced against feasibility. Overall, little evidence on the topic exists and more rigorous investigation is necessary.
Abstract:
Objective: To systematically review the published evidence on the impact of health information technology (HIT) on the quality of medical and health care, specifically clinicians' adherence to evidence-based guidelines and the corresponding impact on patient clinical outcomes. To be as inclusive as possible, the review examined literature discussing the use of health information technologies and systems in both medical care, such as clinical and surgical care, and other health care, such as allied health and preventive services. Design: Systematic review. Data Sources: Relevant English-language studies were systematically searched in MEDLINE and CINAHL (1998 to 2008), the Cochrane Library, PubMed, the Database of Abstracts of Reviews of Effectiveness (DARE), Google Scholar and other relevant electronic databases. A search for eligible studies (matching the inclusion criteria) was also performed in relevant conference proceedings available through the internet and electronic databases, as well as in the reference lists of cited papers. Selection criteria: Studies were included in the review if they examined the impact of an Electronic Health Record (EHR), Computerised Provider Order-Entry (CPOE), or Decision Support System (DS), and if the primary outcomes focused on the level of compliance with evidence-based guidelines among clinicians. Measures could be either changes in clinical processes resulting from a change in providers' behaviour or specific patient outcomes that demonstrated the effectiveness of a particular treatment given by providers. Methods: Studies were reviewed and summarised in tabular and text form. Due to heterogeneity between studies, meta-analysis was not performed. Results: Of 17 studies that assessed the impact of health information technology on health care practitioners' performance, 14 revealed a positive improvement in compliance with evidence-based guidelines. The primary domains of improvement were preventive care and drug ordering. Results from the studies that also assessed patient outcomes, however, were insufficient to detect either clinically or statistically important improvements, as only a small proportion of these studies found benefits: 3 studies showed positive improvement, while 5 revealed either no change or adverse outcomes. Conclusion: Although the number of included studies was relatively small for reaching a conclusive statement about the effectiveness of health information technologies and systems on clinical care, the results are consistent with other systematic reviews previously undertaken. In this review, wide-scale use of HIT was shown to increase clinicians' adherence to guidelines. It therefore presents ongoing opportunities for health care organisations, policy makers and stakeholders to maximise the uptake of research evidence into practice.
Abstract:
Background: Reducing rates of healthcare-acquired infection has been identified by the Australian Commission on Safety and Quality in Health Care as a national priority. One of the goals is the prevention of central venous catheter-related bloodstream infection (CR-BSI). At least 3,500 cases of CR-BSI occur annually in Australian hospitals, resulting in unnecessary deaths and costs to the healthcare system of between $25.7 and $95.3 million. Two approaches to preventing these infections have been proposed: use of antimicrobial catheters (A-CVCs) or a catheter care and management 'bundle'. Given finite healthcare budgets, decisions about the optimal infection control policy require consideration of the effectiveness and value for money of each approach. Objectives: The aim of this research is to use a rational economic framework to inform efficient infection control policy relating to the prevention of CR-BSI in the intensive care unit. It addresses three questions relating to decision-making in this area: 1. Is additional investment in activities aimed at preventing CR-BSI an efficient use of healthcare resources? 2. What is the optimal infection control strategy from amongst the two major approaches that have been proposed to prevent CR-BSI? 3. What uncertainty is there in this decision and can a research agenda to improve decision-making in this area be identified? Methods: A decision analytic model-based economic evaluation was undertaken to identify an efficient approach to preventing CR-BSI in Queensland Health intensive care units. A Markov model describing the epidemiology and prognosis of CR-BSI was developed in conjunction with a panel of clinical experts. The model was parameterised using data systematically identified from the published literature and extracted from routine databases. The quality of the data used in the model, its validity to clinical experts, and its sensitivity to modelling assumptions were assessed. Two separate economic evaluations were conducted. The first evaluation compared all commercially available A-CVCs alongside uncoated catheters to identify which was cost-effective for routine use. The uncertainty in this decision was estimated along with the value of collecting further information to inform the decision. The second evaluation compared the use of A-CVCs to a catheter care bundle. We were unable to estimate the cost of the bundle because it is unclear what the full resource requirements are for its implementation, and what the value of these would be in an Australian context. As such, we undertook a threshold analysis to identify the cost and effectiveness thresholds at which a hypothetical bundle would dominate the use of A-CVCs under various clinical scenarios. Results: In the first evaluation of A-CVCs, the findings from the baseline analysis, in which uncertainty is not considered, show that the use of any of the four A-CVCs will result in health gains accompanied by cost-savings. The MR catheters dominate the baseline analysis, generating 1.64 QALYs and cost-savings of $130,289 per 1,000 catheters. With uncertainty, and based on current information, the MR catheters remain the optimal decision and return the highest average net monetary benefits ($948 per catheter) relative to all other catheter types. This conclusion was robust to all scenarios tested; however, the probability of error in this conclusion is high (62% in the baseline scenario).
Using a value of $40,000 per QALY, the expected value of perfect information associated with this decision is $7.3 million. An analysis of the expected value of perfect information for individual parameters suggests that it may be worthwhile for future research to focus on providing better estimates of the mortality attributable to CR-BSI and the effectiveness of both SPC and CH/SSD (int/ext) catheters. In the second evaluation of the catheter care bundle relative to A-CVCs, the results which do not consider uncertainty indicate that a bundle must achieve a relative risk of CR-BSI of at least 0.45 to be cost-effective relative to MR catheters. If the bundle can reduce rates of infection from 2.5% to effectively zero, it is cost-effective relative to MR catheters if national implementation costs are less than $2.6 million ($56,610 per ICU). If the bundle can achieve a relative risk of 0.34 (comparable to that reported in the literature) it is cost-effective, relative to MR catheters, if costs over an 18-month period are below $613,795 nationally ($13,343 per ICU). Once uncertainty in the decision is considered, the cost threshold for the bundle increases to $2.2 million. Therefore, if each of the 46 Level III ICUs could implement an 18-month catheter care bundle for less than $47,826 each, this approach would be cost-effective relative to A-CVCs. However, the uncertainty is substantial and the probability of error in concluding that the bundle is the cost-effective approach at a cost of $2.2 million is 89%. Conclusions: This work highlights that infection control to prevent CR-BSI is an efficient use of healthcare resources in the Australian context. If there is no further investment in infection control, an opportunity cost is incurred, which is the potential for a more efficient healthcare system. Minocycline/rifampicin catheters are the optimal choice of antimicrobial catheter for routine use in Australian Level III ICUs; however, if a catheter care bundle implemented in Australia was as effective as those used in the large studies in the United States, it would be preferred over the catheters if it was able to be implemented for less than $47,826 per Level III ICU. Uncertainty is very high in this decision and arises from multiple sources. There are likely greater costs to this uncertainty for A-CVCs, which may carry hidden costs, than there are for a catheter care bundle, which is more likely to provide indirect benefits to clinical practice and patient safety. Research into the mortality attributable to CR-BSI, the effectiveness of SPC and CH/SSD (int/ext) catheters and the cost and effectiveness of a catheter care bundle in Australia should be prioritised to reduce uncertainty in this decision. This thesis provides the economic evidence to inform one area of infection control, but there are many other infection control decisions for which information about the cost-effectiveness of competing interventions does not exist. This work highlights some of the challenges and benefits to generating and using economic evidence for infection control decision-making and provides support for commissioning more research into the cost-effectiveness of infection control.
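The decision metrics used in this evaluation, net monetary benefit, the probability that a strategy is optimal, and the expected value of perfect information (EVPI), can be illustrated with a small simulation. The sketch below uses made-up distributions for incremental QALYs and costs per 1,000 catheters; it demonstrates the calculations only and does not reproduce the thesis model or its results.

```python
# Simplified sketch of net monetary benefit, probability of being optimal, and
# overall EVPI from probabilistic sensitivity analysis output. The simulated
# inputs are illustrative placeholders, not the thesis model's outputs.
import numpy as np

rng = np.random.default_rng(0)
wtp = 40_000   # willingness to pay per QALY, as used above
n = 10_000     # number of probabilistic sensitivity analysis iterations

# Hypothetical incremental QALYs and costs per 1,000 catheters, relative to
# uncoated catheters (negative costs are savings).
qalys = {"uncoated": np.zeros(n),
         "MR":  rng.normal(1.64, 0.8, n),
         "SPC": rng.normal(1.10, 0.9, n)}
costs = {"uncoated": np.zeros(n),
         "MR":  rng.normal(-130_289, 60_000, n),
         "SPC": rng.normal(-90_000, 70_000, n)}

nmb = {k: wtp * qalys[k] - costs[k] for k in qalys}   # net monetary benefit
stacked = np.column_stack([nmb[k] for k in nmb])      # iterations x strategies
best_each_run = stacked.max(axis=1)

for i, k in enumerate(nmb):
    print(k, "mean NMB per 1,000 catheters:", round(nmb[k].mean()),
          "P(optimal):", round(float((stacked.argmax(axis=1) == i).mean()), 2))

# EVPI = E[max NMB over strategies] - max over strategies of E[NMB]
evpi = best_each_run.mean() - stacked.mean(axis=0).max()
print("EVPI per 1,000 catheters:", round(evpi))
```

In a full probabilistic sensitivity analysis the simulated QALYs and costs would come from the Markov model described above rather than the placeholder normal distributions used here.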
Abstract:
Few studies have evaluated the reliability of lifetime sun exposure estimated from inquiring about the number of hours people spent outdoors in a given period on a typical weekday or weekend day (the time-based approach). Some investigations have suggested that women have a particularly difficult task in estimating time outdoors in adulthood due to their family and occupational roles. We hypothesized that people might gain additional memory cues and estimate lifetime hours spent outdoors more reliably if asked about time spent outdoors according to specific activities (an activity-based approach). Using self-administered, mailed questionnaires, test-retest responses to time-based and to activity-based approaches were evaluated in 124 volunteer radiologic technologist participants from the United States: 64 females and 60 males 48 to 80 years of age. Intraclass correlation coefficients (ICC) were used to evaluate the test-retest reliability of average number of hours spent outdoors in the summer estimated for each approach. We tested the differences between the two ICCs, corresponding to each approach, using a t test with the variance of the difference estimated by the jackknife method. During childhood and adolescence, the two approaches gave similar ICCs for average numbers of hours spent outdoors in the summer. By contrast, compared with the time-based approach, the activity-based approach showed significantly higher ICCs during adult ages (0.69 versus 0.43, P = 0.003) and over the lifetime (0.69 versus 0.52, P = 0.05); the higher ICCs for the activity-based questionnaire were primarily derived from the results for females. Research is needed to further improve the activity-based questionnaire approach for long-term sun exposure assessment. (Cancer Epidemiol Biomarkers Prev 2009;18(2):464–71)
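The reliability statistic compared in this study is the intraclass correlation for test-retest agreement. The sketch below computes a two-way mixed, consistency ICC, ICC(3,1), for two administrations using simulated hours-outdoors data; the values are placeholders, and the jackknife-based comparison of two ICCs described above is not reproduced.

```python
# A sketch of a test-retest intraclass correlation, ICC(3,1), like those
# compared above; the input data are simulated, not the study's.
import numpy as np

def icc_3_1(test: np.ndarray, retest: np.ndarray) -> float:
    """ICC(3,1) (two-way mixed, consistency) for two occasions per subject."""
    x = np.column_stack([test, retest])            # subjects x occasions
    n, k = x.shape
    grand = x.mean()
    ss_subjects = k * ((x.mean(axis=1) - grand) ** 2).sum()
    ss_occasions = n * ((x.mean(axis=0) - grand) ** 2).sum()
    ss_error = ((x - grand) ** 2).sum() - ss_subjects - ss_occasions
    bms = ss_subjects / (n - 1)                    # between-subjects mean square
    ems = ss_error / ((n - 1) * (k - 1))           # error mean square
    return (bms - ems) / (bms + (k - 1) * ems)

rng = np.random.default_rng(1)
truth = rng.gamma(shape=3.0, scale=1.5, size=124)  # "true" summer hours/day
t1 = truth + rng.normal(0, 1.2, 124)               # test administration
t2 = truth + rng.normal(0, 1.2, 124)               # retest administration
print("Test-retest ICC:", round(icc_3_1(t1, t2), 2))
```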
Abstract:
Introduction. In adults, oral health has been shown to worsen during critical illness as well as influence systemic health. There is a paucity of paediatric critical care research in the area of oral health; hence the purpose of the Critically ill Children’s Oral Health (CCOH) study is to describe the status of oral health of critically ill children over time spent in the paediatric intensive care unit (PICU). The study will also examine the relationship between poor oral health and a variety of patient characteristics and PICU therapies and explore the relationship between dysfunctional oral health and PICU related Healthcare-Associated Infections (HAI). Method. An observational study was undertaken at a single tertiary-referral PICU. Oral health was measured using the Oral Assessment Scale (OAS) and culturing oropharyngeal flora. Information was also collected surrounding the use of supportive therapies, clinical characteristics of the children and the occurrence of PICU related HAI. Results. Forty-six participants were consecutively recruited to the CCOH study. Of the participants 63% (n=32) had oral dysfunction while 41% (n=19) demonstrated pathogenic oropharyngeal colonisation during their critical illness. The potential systemic pathogens isolated from the oropharynx and included Candida sp., Staphylococcus aureus, Haemophilus influenzae, Enterococcus sp. and Pseudomonas aeruginosa. The severity of critical illness had a significant positive relationship (p=0.046) with pathogenic and absent colonisation of the oropharynx. Sixty-three percent of PICU-related HAI involved the preceding or simultaneous colonisation of the oropharynx by the causative pathogen. Conclusion. Given the prevalence of poor oral health during childhood critical illness and the subsequent potential systemic consequences, evidence based oral hygiene practices should be developed and validated to guide clinicians when nursing critically ill children.
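As an illustration only (not the CCOH analysis), the sketch below tests whether a severity-of-illness score differs between children with normal flora and those with abnormal colonisation, using a Mann-Whitney U test on simulated scores. The n=19 group size mirrors the pathogenic-colonisation figure above; all other numbers are invented.

```python
# Illustrative test of whether illness severity differs between children with
# and without abnormal oropharyngeal colonisation; all values are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
severity_normal_flora = rng.normal(4.0, 2.0, 27)   # hypothetical severity scores
severity_abnormal = rng.normal(5.5, 2.0, 19)       # pathogenic or absent colonisation

u, p = stats.mannwhitneyu(severity_abnormal, severity_normal_flora,
                          alternative="greater")
print(f"Mann-Whitney U = {u:.1f}, one-sided p = {p:.3f}")
```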
Abstract:
Objective: To evaluate the importance of contextual and policy factors in nurses' judgments about medication administration practice. Design: A questionnaire survey of responses to a number of factorial vignettes in June 2004. These vignettes considered combinations of seven contextual and policy factors thought to influence nurses' judgments relating to medication administration. Participants: 185 (67% of eligible) clinical paediatric nursing staff returned completed questionnaires. Setting: A tertiary paediatric hospital in Brisbane, Australia. Results: Double-checking the patient, double-checking the drug and checking the legality of the prescription were the three strongest predictors of nurses' actions regarding medication administration. Conclusions: Policy factors, and not contextual factors, drive nurses' judgments in response to hypothetical scenarios.
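Factorial vignette responses are commonly analysed by regressing each judgement on the manipulated factors, with standard errors clustered by respondent because each nurse rates several vignettes. The sketch below is a hypothetical example of that general approach; the data file, outcome and factor names are assumptions, not the study's instrument.

```python
# Hypothetical sketch of a factorial-vignette regression: one row per nurse
# per vignette, binary indicators for the manipulated factors, and standard
# errors clustered on the nurse. All names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

vignettes = pd.read_csv("vignette_responses.csv")  # hypothetical long-format data

model = smf.ols(
    "would_administer ~ double_check_patient + double_check_drug"
    " + prescription_legal + ward_busy + parent_present"
    " + drug_familiar + prescriber_available",
    data=vignettes,
).fit(cov_type="cluster", cov_kwds={"groups": vignettes["nurse_id"]})
print(model.summary())  # the largest coefficients are the strongest predictors
```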
Abstract:
Background: Patient education to reduce stroke risk has had mixed effects, raising questions about how to achieve optimal benefit. Because past evaluations have typically lacked an appropriate theoretical base, previous research may have missed important effects. Method: This study used a social cognitive framework to identify variables that might change in response to education. A mixed design was used to evaluate two approaches to an intervention, both of which included education. Fifty seniors completed a measure of stroke knowledge and beliefs twice: before and after an intervention that was either standard (educational brochure plus activities that were not about stroke) or enhanced (educational brochure plus activities designed to enhance beliefs about stroke). Outcome measures were health beliefs, intention to exercise to reduce stroke risk, and stroke knowledge. Results: Selected beliefs changed significantly over time but not differentially across conditions. The beliefs that changed were (a) perceived susceptibility to stroke and (b) perceived benefit of exercise in reducing risk. Benefit beliefs, in particular, were strongly and positively associated with intention to exercise. Conclusion: Findings suggest that basic approaches to patient education may influence health beliefs. More effective stroke prevention programs may result from continued consideration of the role of health beliefs in such programs.
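A baseline-adjusted comparison is one standard way to test whether belief change differed by condition; the sketch below pairs it with a simple paired t-test for change over time. It is illustrative only, and the data file and column names (susceptibility_pre, susceptibility_post, condition) are assumptions rather than the authors' analysis plan.

```python
# Illustrative sketch of testing change over time and differential change by
# condition in a pre/post design; column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

seniors = pd.read_csv("stroke_beliefs.csv")  # hypothetical: one row per participant

# Did perceived susceptibility change over time? (paired comparison)
t, p = stats.ttest_rel(seniors["susceptibility_post"], seniors["susceptibility_pre"])
print(f"paired t = {t:.2f}, p = {p:.3f}")

# Was change differential across conditions? (baseline-adjusted comparison)
model = smf.ols("susceptibility_post ~ condition + susceptibility_pre",
                data=seniors).fit()
print(model.summary())
```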
Abstract:
The relationship between deformity correction and self-reported patient satisfaction after thoracoscopic anterior scoliosis surgery is unknown. Scoliosis Research Society (SRS) questionnaire scores, radiographic outcomes, and rib hump correction were prospectively assessed for a group of 100 patients pre-operatively and at two years after surgery. Patients with lower post-operative major Cobb angles reported significantly higher SRS scores than patients with higher post-operative Cobb angles.
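The association reported here could be examined either as a correlation between post-operative Cobb angle and SRS score or as a comparison of SRS scores between lower- and higher-Cobb groups; the sketch below shows both on simulated data (all values are invented, not the study's).

```python
# Illustrative analysis of the Cobb angle / SRS score association on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
cobb_post = rng.normal(22, 8, 100)                             # degrees, hypothetical
srs_total = 4.3 - 0.01 * cobb_post + rng.normal(0, 0.25, 100)  # hypothetical SRS scores

r, p = stats.pearsonr(cobb_post, srs_total)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")

# Equivalent group comparison: lower vs higher post-op Cobb angle (median split)
lower = srs_total[cobb_post <= np.median(cobb_post)]
higher = srs_total[cobb_post > np.median(cobb_post)]
print(stats.ttest_ind(lower, higher))
```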
Abstract:
Background: The preservation of meniscal tissue is important to protect joint surfaces. Purpose: We have an aggressive approach to meniscal repair, including repairing tears other than those classically suited to repair. Here we present the medium- to long-term outcome of meniscal repair (inside-out) in elite athletes. Study Design: Case series; Level of evidence, 4. Methods: Forty-two elite athletes underwent 45 meniscal repairs. All repairs were performed using an arthroscopically assisted inside-out technique. Eighty-three percent of these athletes had ACL reconstruction at the same time. Patients returned a completed questionnaire (including Lysholm and International Knee Documentation Committee [IKDC] scores). Mean follow-up was 8.5 years. Failure was defined by patients developing symptoms of joint line pain and/or locking or swelling requiring repeat arthroscopy and partial meniscectomy. Results: The average Lysholm and subjective IKDC scores were 89.6 and 85.4, respectively. Eighty-one percent of patients returned to their main sport, most to a similar level, at a mean time of 10.4 months after repair, reflecting the high rate of ACL reconstruction in this group. We identified 11 definite failures, 10 medial and 1 lateral meniscus, that required excision; this represents a 24% failure rate. We identified 1 further patient with a possible failed repair, giving a worst-case failure rate of 26.7% at a mean of 42 months after surgery. However, 7 of these failures were associated with a further injury; the atraumatic failure rate was therefore 11%. Age, size, and location of the tears were not associated with a higher failure rate. Medial meniscal repairs were significantly more likely to fail than lateral repairs, with failure rates of 36.4% and 5.6%, respectively (P < .05). Conclusion: Meniscal repair and healing are possible, and most elite athletes can return to their preinjury level of activity.
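The failure-rate arithmetic quoted above, plus a Fisher's exact test for the medial-versus-lateral comparison, can be reproduced as follows. The overall counts (45 repairs, 11 definite and 1 possible failure) come from the abstract; the medial/lateral denominators in the contingency table are illustrative assumptions, not figures reported by the authors.

```python
# Small arithmetic sketch of the failure figures quoted above, with an
# illustrative Fisher's exact test for medial vs lateral repairs.
from scipy import stats

repairs, failures, possible = 45, 11, 1
print(f"Failure rate: {100 * failures / repairs:.1f}%")                  # about 24%
print(f"Worst-case rate: {100 * (failures + possible) / repairs:.1f}%")  # 26.7%

# Hypothetical split into 27 medial and 18 lateral repairs (10 and 1 failures)
medial_fail, medial_n = 10, 27
lateral_fail, lateral_n = 1, 18
table = [[medial_fail, medial_n - medial_fail],
         [lateral_fail, lateral_n - lateral_fail]]
odds_ratio, p = stats.fisher_exact(table)
print(f"Medial vs lateral failure, Fisher's exact p = {p:.3f}")
```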