873 results for Chronic pain -- Chemotherapy
Abstract:
Background: Detection of Ureaplasma species in amniotic fluid at the time of second-trimester amniocentesis increases the risk of preterm birth, but most affected pregnancies continue to term (Gerber et al. J Infect Dis 2003). We aimed to model intra-amniotic (IA) ureaplasma infection in spiny mice, a species with a relatively long gestation (39 days) that allows investigation of the disposition and possible clearance of ureaplasmas in the feto-placental compartment. Method: Pregnant spiny mice received IA injections of U. parvum serovar 6 (10 µL, 1×10⁴ colony-forming units in PBS) or 10B media (10 µL; control) at 20 days (d) of gestation (term = 39 d). At 37 d, fetuses (n=3 ureaplasma, n=4 control) were surgically delivered and tissues were collected for: bacterial culture; ureaplasma mba and urease gene expression by PCR; tissue WBC counts; and indirect fluorescent antibody (IFA) staining using anti-ureaplasma serovar 6 (rabbit) antiserum. Maternal and fetal plasma IgG was measured by Western blot. Results: Ureaplasmas were not detected by culture or PCR in fetal or maternal tissues but were visualized by IFA within placental and fetal lung tissues, in association with inflammatory changes and elevated WBC counts (p<0.0001). Anti-ureaplasma IgG was detected in maternal (2/2 tested) and fetal (1/2 tested) plasma but not in controls (0/3). Conclusions: IA injection of ureaplasmas in mid-gestation spiny mice caused persistent fetal lung and placental infection even though ureaplasmas were undetectable using standard culture or PCR techniques. This is consistent with resolution of IA infection, which may occur in human pregnancies that continue to term despite detection of ureaplasmas in mid-gestation.
Abstract:
Background: We have developed a sheep model of intrauterine ureaplasma infection. We aimed to examine the capability of ureaplasmas in the amniotic fluid to infect the fetus and alter fetal development...
Abstract:
Interaction between the endocrine and immune systems is necessary to regulate our health. However, under some conditions, stress hormones can overstimulate or suppress the immune system, resulting in harmful consequences (1). Stress is often considered negative, yet it is an intrinsic part of everyday life. Stress is not clearly defined; it is context-specific and depends on the nature of the factors that challenge the body. Internal stimuli elicit different stress reactions than external stimuli (1). Similarly, some stressors induce responses that may benefit survival, whereas others cause disturbances that may endanger our health. Stress also depends on how our bodies perceive and respond to stressful stimuli (1).
Abstract:
International research on prisoners demonstrates poor health outcomes, including chronic disease, with a high overall burden to the community. Prisoners are predominantly male and young. In Australia, the average incarceration length is 3 years, sufficient to impact long-term health, including nutrition. Food in prisons is highly controlled, yet gaps exist in policy. In most Western countries, prisons promote healthy foods, often incongruent with prisoner expectations or wants. Few studies have examined dietary intakes during incarceration in relation to food policy. In this study, detailed diet histories were collected from 120 of 945 men (mean age = 32 years) in a high-security prison. Intakes were verified via individual purchase records, mealtime observations, and audits of food preparation, purchasing and holdings. Physical measurements (including fasting bloods) were taken and medical records reviewed. Results showed the standard food provided was consistent with current dietary guidelines, but limited in menu choice. Diet histories revealed self-funded foods contributing 1–63% of energy (mean = 30%), 0–83% of sugar (mean = 38%), 1–77% of saturated fats (mean = 31%) and 1–59% of sodium (mean = 23%). High levels of modification to the food provided were found, using minimal cooking amenities and incorporating self-funded foods and/or foods retained from previous meals. Medical records and physical measurements confirmed markers of chronic disease. This study highlights the need to establish clear guidelines on all food available in prisons if chronic disease risk reduction is a goal. It also supports evidence-based food and nutrition policy covering menu choice, food quality, quantity and safety, as well as the type of, and access to, self-funded foods.
Abstract:
Context: Very few authors have investigated the relationship between hip-abductor muscle strength and frontal-plane knee mechanics during running. Objective: To investigate this relationship using a 3-week hip-abductor muscle-strengthening program to identify changes in strength, pain, and biomechanics in runners with patellofemoral pain syndrome (PFPS). Design: Cohort study. Setting: University-based clinical research laboratory. Patients or Other Participants: Fifteen individuals (5 men, 10 women) with PFPS and 10 individuals without PFPS (4 men, 6 women) participated. Intervention(s): The patients with PFPS completed a 3-week hip-abductor strengthening protocol; control participants did not. Main Outcome Measure(s): The dependent variables of interest were maximal isometric hip-abductor muscle strength, 2-dimensional peak knee genu valgum angle, and stride-to-stride knee-joint variability. All measures were recorded at baseline and 3 weeks later. Between-group differences were compared using repeated-measures analyses of variance. Results: At baseline, the PFPS group exhibited reduced strength, no difference in peak genu valgum angle, and increased stride-to-stride knee-joint variability compared with the control group. After the 3-week protocol, the PFPS group demonstrated increased strength, less pain, no change in peak genu valgum angle, and reduced stride-to-stride knee-joint variability compared with baseline. Conclusions: A 3-week hip-abductor muscle-strengthening protocol was effective in increasing muscle strength and decreasing pain and stride-to-stride knee-joint variability in individuals with PFPS. However, concomitant changes in peak knee genu valgum angle were not observed.
Abstract:
Context: It has been theorized that a positive Trendelenburg test (TT) indicates weakness of the stance hip-abductor (HABD) musculature, results in contralateral pelvic drop, and represents impaired load transfer, which may contribute to low back pain. Few studies have tested whether weakness of the HABDs is directly related to the magnitude of pelvic drop (MPD). Objective: To examine the relationship between HABD strength and MPD during the static TT and during walking for patients with nonspecific low back pain (NSLBP) and healthy controls (CON). A secondary purpose was to examine this relationship in NSLBP after a 3-wk HABD-strengthening program. Design: Quasi-experimental. Setting: Clinical research laboratory. Participants: 20 (10 NSLBP and 10 CON). Intervention: HABD strengthening. Main Outcome Measures: Normalized HABD strength, MPD during TT, and maximal pelvic frontal-plane excursion during walking. Results: At baseline, the NSLBP subjects were significantly weaker (31%; P = .03) than CON. No differences in maximal pelvic frontal-plane excursion (P = .72), right MPD (P = 1.00), or left MPD (P = .40) were measured between groups. During the static TT, nonsignificant correlations were found between left HABD strength and right MPD for NSLBP (r = -.32, P = .36) and CON (r = -.24, P = .48) and between right HABD strength and left MPD for NSLBP (r = -.24, P = .50) and CON (r = -.41, P = .22). Nonsignificant correlations were found between HABD strength and maximal pelvic frontal-plane excursion for NSLBP (r = -.04, P = .90) and CON (r = -.14, P = .68). After strengthening, NSLBP demonstrated significant increases in HABD strength (12%; P = .02), a 48% reduction in pain, and no differences in MPD during the static TT or in maximal pelvic frontal-plane excursion compared with baseline. Conclusions: HABD strength was poorly correlated with MPD during the static TT and during walking in CON and NSLBP. The results suggest that HABD strength may not be the only factor contributing to pelvic stability, and the static TT has limited use as a measure of HABD function.
Abstract:
Purpose: To examine the relationship between hip abductor muscle (HABD) strength and the magnitude of pelvic drop (MPD) for patients with non-specific low back pain (NSLBP) and controls (CON) prior to and following a 3-week HABD strengthening protocol. At baseline, we hypothesized that NSLBP patients would exhibit reduced HABD strength and greater MPD compared to CON. Following the protocol, we hypothesized that strength would increase and MPD would decrease. Relevance: The Trendelenburg test (TT) is a common clinical test used to examine the ability of the HABD to maintain a horizontal pelvic position during single-limb stance. However, no study has specifically tested this theory. Moreover, no study has investigated the relationship between HABD strength and pelvic motion during walking or tested whether increased HABD strength reduces the MPD. Methods: Quasi-experimental with a 3-week exercise intervention. Fifteen NSLBP patients (32.5 yrs, range 21-51 yrs; VAS baseline: 5.3 cm) and 10 CON (29.5 yrs, range 22-47 yrs) were recruited. Isometric HABD strength was measured using a force dynamometer, and the average of three maximal voluntary contractions was normalized to body mass (N/kg). Two-dimensional MPD (degrees) was measured using a 60 Hz camera and was derived from two retroreflective markers placed on the posterior superior iliac spines. MPD was measured while performing the static TT and while walking, averaged over 10 consecutive footfalls. NSLBP patients completed a 3-week HABD strengthening protocol consisting of 2 open-kinetic-chain exercises, after which all measures were repeated. Non-parametric analysis was used for group comparisons and correlation analysis. Results: At baseline, the NSLBP patients demonstrated 31% lower HABD strength (mean=6.6 N/kg) compared to CON (mean=9.5 N/kg; p=0.03) and no significant differences in maximal pelvic frontal-plane excursion while walking (NSLBP: mean=8.1°, CON: mean=7.1°; p=0.72). No significant correlations were measured between left HABD strength and right MPD (r=-0.37, p=0.11), or between right HABD strength and left MPD (r=-0.04, p=0.84), while performing the static TT. Following the 3-week strengthening protocol, NSLBP patients demonstrated a 12% improvement in strength (post: mean=7.4 N/kg; p=0.02) and a reduction in pain (VAS follow-up: 2.8 cm), but no significant decrease in MPD while walking (p=0.92). Conclusions: NSLBP patients demonstrated reduced HABD strength at baseline and were able to increase strength and reduce pain within a 3-week period. However, despite increases in HABD strength, the NSLBP group exhibited MPD during the static TT and while walking similar to baseline and to controls. Implications: The results suggest that the HABD alone may not be primarily responsible for maintaining a horizontal pelvic position under static and dynamic conditions. Increasing hip-abductor strength reduced pain in NSLBP patients, providing grounds for further research to identify the specific musculature responsible for controlling pelvic motion.
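As a concrete illustration of the normalization and non-parametric analysis described above, the following minimal Python sketch shows how isometric force is normalized to body mass (N/kg) and how a Spearman correlation between strength and pelvic drop could be computed; every numeric value in it is hypothetical, not data from the study.

```python
import numpy as np
from scipy import stats

# Normalize isometric hip-abductor strength to body mass (N/kg), as described:
# the average of three maximal voluntary contractions divided by body mass.
forces_n = np.array([412.0, 405.0, 398.0])   # three MVC trials in newtons (hypothetical)
body_mass_kg = 61.0                          # hypothetical participant mass
strength_n_per_kg = forces_n.mean() / body_mass_kg

# Between-group percent difference, as reported (6.6 vs 9.5 N/kg -> ~31% weaker).
nslbp_mean, con_mean = 6.6, 9.5
pct_weaker = (con_mean - nslbp_mean) / con_mean * 100

# Non-parametric (Spearman) correlation between strength and pelvic drop,
# mirroring the abstract's correlation analysis (values are illustrative only).
habd_strength = np.array([6.1, 6.8, 7.2, 5.9, 8.0, 6.5])
pelvic_drop_deg = np.array([4.2, 3.9, 3.1, 4.8, 2.9, 4.0])
rho, p = stats.spearmanr(habd_strength, pelvic_drop_deg)

print(f"{strength_n_per_kg:.1f} N/kg; {pct_weaker:.0f}% weaker; rho={rho:.2f}, p={p:.2f}")
```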
Abstract:
Background: The largest proportion of cancer patients are aged 65 years and over. Increasing age is also associated with nutritional risk and multi-morbidities, factors which complicate the cancer treatment decision-making process in older patients. Objectives: To determine whether malnutrition risk and Body Mass Index (BMI) are associated with key oncogeriatric variables as potential predictors of chemotherapy outcomes in geriatric oncology patients with solid tumours. Methods: In this longitudinal study, geriatric oncology patients (aged ≥65 years) received a Comprehensive Geriatric Assessment (CGA) for baseline data collection prior to the commencement of chemotherapy treatment. Malnutrition risk was assessed using the Malnutrition Screening Tool (MST), and BMI was calculated from anthropometric data. Nutritional risk was compared with other variables collected as part of the standard CGA. Associations were determined by chi-square tests and correlations. Results: Over half of the 175 geriatric oncology patients were at risk of malnutrition (53.1%) according to the MST. BMI ranged from 15.5–50.9 kg/m², with 35.4% of the cohort overweight when compared to geriatric cutoffs. Malnutrition risk was more prevalent in those who were underweight (70%), although many overweight participants were also at risk (34%). Malnutrition risk was associated with a diagnosis of colorectal or lung cancer (p=0.001), dependence in activities of daily living (p=0.015) and impaired cognition (p=0.049). Malnutrition risk was positively associated with vulnerability to intensive cancer therapy (rho=0.16, p=0.038). Higher BMI was associated with a greater number of multi-morbidities (rho=.27, p=0.001). Conclusions: Malnutrition risk is prevalent among geriatric patients undergoing chemotherapy, is more common in colorectal and lung cancer diagnoses, is associated with impaired functionality and cognition, and negatively influences the ability to complete planned intensive chemotherapy.
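For readers unfamiliar with the anthropometric calculation, here is a minimal Python sketch of how BMI is computed and categorised against the geriatric cutoffs cited in these abstracts (underweight below 21.9 kg/m², overweight above 27 kg/m²); the example values are hypothetical.

```python
# Minimal sketch: BMI and geriatric weight categories. The cutoffs below follow
# the companion abstract in this listing (underweight BMI < 21.9, overweight
# BMI > 27); they differ from general-population cutoffs (18.5 / 25).
def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index = weight (kg) / height (m) squared."""
    return weight_kg / height_m ** 2

def geriatric_bmi_category(value: float) -> str:
    if value < 21.9:
        return "underweight"
    if value > 27.0:
        return "overweight"
    return "healthy range"

value = bmi(52.0, 1.63)   # hypothetical patient: ~19.6 kg/m^2
print(f"{value:.1f} kg/m^2 -> {geriatric_bmi_category(value)}")
```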
Abstract:
Clinicians often report that currently available methods to assess older patients, including standard clinical consultations, do not elicit the information necessary to make an appropriate cancer treatment recommendation for older cancer patients. An increasingly popular way of assessing the potential of older patients to cope with chemotherapy is a Comprehensive Geriatric Assessment. What constitutes Comprehensive Geriatric Assessment, however, is open to interpretation and varies from one setting to another. Furthermore, Comprehensive Geriatric Assessment’s usefulness as a predictor of fitness for chemotherapy and as a determinant of actual treatment is not well understood. In this article, we analyse how Comprehensive Geriatric Assessment was developed for use in a large cancer service in an Australian capital city. Drawing upon Actor–Network Theory, our findings reveal how, during its development, Comprehensive Geriatric Assessment was made both a tool and a science. Furthermore, we briefly explore the tensions that we experienced as scholars who analyse medico-scientific practices and as practitioner–designers charged with improving the very tools we critique. Our study contributes towards geriatric oncology by scrutinising the medicalisation of ageing, unravelling the practices of standardisation and illuminating the multiplicity of ‘fitness for chemotherapy’.
Abstract:
Background: Malnutrition before and during chemotherapy is associated with poor treatment outcomes. The risk of cancer-related malnutrition is exacerbated by common nutrition impact symptoms during chemotherapy, such as nausea, diarrhoea and mucositis. Aim of presentation: To describe the prevalence of malnutrition/malnutrition risk in two samples of patients treated in a quaternary-level chemotherapy unit. Research design: Cross-sectional survey. Sample 1: Patients ≥65 years prior to chemotherapy treatment (n=175). Instrument: Nurse-administered Malnutrition Screening Tool to screen for malnutrition risk, plus body mass index (BMI). Sample 2: Patients ≥18 years receiving chemotherapy (n=121). Instrument: Dietitian-administered Patient-Generated Subjective Global Assessment to assess malnutrition, malnutrition risk and BMI. Findings, Sample 1: 93/175 (53%) of older patients were at risk of malnutrition prior to chemotherapy; 27 (15%) were underweight (BMI <21.9) and 84 (48%) were overweight (BMI >27). Findings, Sample 2: 31/121 patients (26%) were malnourished; 12 (10%) had intake-limiting nausea or vomiting; 22 (20%) reported significant weight loss; and 20 (18%) required improved nutritional symptom management during treatment. Thirteen participants with malnutrition/nutrition impact symptoms (35%) had no dietitian contact; the majority of these participants were overweight. Implications for nursing: Patients with, or at risk of, malnutrition before and during chemotherapy can be overlooked, particularly if they are overweight. Older patients seem particularly at risk. Nurses can easily and quickly identify risk through regular use of the Malnutrition Screening Tool and refer patients to expert dietetic support, to help ensure optimal treatment outcomes.
Abstract:
Background: The Vulnerable Elders Survey-13 (VES-13) is increasingly used to screen for older patients who can proceed to intensive chemotherapy without further comprehensive assessment. This study compared the VES-13 determination of fitness for treatment with the oncologist's assessment of fitness. Method: Sample: consecutive series of solid-tumour patients ≥65 years (n=175; mean age=72; range=65-86) from an Australian cancer centre. Patients were screened with the VES-13 before proceeding to usual treatment. Blinded to screening, oncologists concurrently predicted patient fitness for chemotherapy. A sample of 175 can detect, with 90% power, kappa coefficients of agreement between VES-13 and oncologists' assessments >0.90 ("almost perfect agreement"). Separate backward stepwise logistic regression analyses assessed potential predictors of VES-13 and oncologists' ratings of fitness. Results: The kappa coefficient for agreement between VES-13 and oncologists' ratings of fitness was 0.41 (p<0.001); the two agreed in 71% of ratings. VES-13 sensitivity = 83.3%; specificity = 57%; positive predictive value = 69%; negative predictive value = 75%. Logistic regression modelling indicated that the odds of being vulnerable to chemotherapy (VES-13) increased with increasing depression (OR=1.42; 95% CI: 1.18, 1.71) and decreased with increased functional independence assessed on the Barthel Index (OR=0.82; CI: 0.74, 0.92) and Lawton instrumental activities of daily living (OR=0.44; CI: 0.30, 0.65); R²=.65. Similarly, the odds of a patient being vulnerable to chemotherapy, when assessed by physicians, increased with increasing age (OR=1.15; CI: 1.07, 1.23) and depression (OR=1.23; CI: 1.06, 1.43), and decreased with increasing functional independence (OR=0.91; CI: 0.85, 0.98); R²=.32. Conclusions: Our data indicate moderate agreement between VES-13 and clinician assessments of patients' fitness for chemotherapy. Current 'one-step' screening processes to determine fitness have limits. Nonetheless, screening tools have the potential for modification and enhanced predictive properties in cancer care through the addition of relevant items, enabling fit patients to be immediately referred for chemotherapy.
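The agreement statistics above all derive from a single 2×2 cross-tabulation of VES-13 versus oncologist ratings. The sketch below shows the standard formulas; the abstract does not give the actual table, so the counts are reconstructed to approximately reproduce the reported figures (kappa ≈ 0.41, 71% agreement, sensitivity ≈ 83%, specificity ≈ 57%).

```python
# Reconstructed (NOT reported) counts for a 2x2 table of VES-13 vs oncologist
# ratings of vulnerability, chosen so the derived statistics roughly match
# the values in the abstract. n = 175.
tp, fp, fn, tn = 77, 36, 15, 47
n = tp + fp + fn + tn

sensitivity = tp / (tp + fn)   # true-positive rate
specificity = tn / (tn + fp)   # true-negative rate
ppv = tp / (tp + fp)           # positive predictive value
npv = tn / (tn + fn)           # negative predictive value

# Cohen's kappa: raw agreement corrected for chance agreement.
po = (tp + tn) / n             # observed agreement (~0.71)
pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
kappa = (po - pe) / (1 - pe)

print(f"sens={sensitivity:.2f} spec={specificity:.2f} "
      f"PPV={ppv:.2f} NPV={npv:.2f} kappa={kappa:.2f}")
```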
Abstract:
Purpose: To present the results of a mixed-method study comparing the level of agreement of a two-phased, nurse-administered Comprehensive Geriatric Assessment (CGA) with current methods of assessing the fitness for chemotherapy of older cancer patients. A nurse-led model of multidisciplinary cancer care based on the results is also described. Methods: The two phases comprised initial screening by a nurse with the Vulnerable Elders Survey-13 (VES-13), followed by nurse administration of a detailed CGA. Both phases were linked to a computerised algorithm categorising the patient as 'fit', 'vulnerable' or 'frail'. The study determined the level of agreement between VES-13- and CGA-determined categories, and between the CGA and the physicians' assessments. It also compared the CGA's predictive ability with respect to subsequent treatment toxicity, while interviews determined the acceptability of the nurse-led procedure from key stakeholders' perspectives. Results: Data collection was completed in December 2011, and the results will be presented at the conference. A consecutive series (n=170) will be enrolled, 33% of whom are 'fit', 33% 'vulnerable' and 33% 'too frail' for treatment. This sample can detect, with 90% power, kappa coefficients of agreement of ≥0.70 ('substantial agreement'). Fitness sub-group comparisons of agreement between the medical oncologist and the nurse assessments can detect kappa estimates of κ ≥ 0.80 with the same power. Conclusion: The results have informed a nurse-led model of cancer care. It meets a clear need to develop, implement and test a nurse-led, robust, evidence-based, clinically justifiable and economically feasible CGA process with relevance in national and international contexts.
Abstract:
Background: Thoracoscopic anterior scoliosis instrumentation is a safe and viable surgical option for corrective fusion of progressive adolescent idiopathic scoliosis (AIS) and has been performed at our centre on 205 patients since 2000. However, there is a paucity of literature reporting on, or examining, optimum methods of analgesia following this type of surgery. A retrospective study was designed to present the authors' technique for delivering intermittent local anaesthetic boluses via an intrapleural catheter following thoracoscopic scoliosis surgery, and to report the pain levels that may be expected and any adverse effects associated with the use of intrapleural analgesia as part of a combined postoperative analgesia regime. Methods: Records for 32 patients who underwent thoracoscopic anterior correction for AIS were reviewed. All patients received an intrapleural catheter inserted during surgery, in addition to patient-controlled opiate analgesia and oral analgesia. After surgery, patients received a bolus of 0.25% bupivacaine every four hours via the intrapleural catheter. Patients' perceptions of their pain control were measured using visual analogue scale (VAS) pain scores, recorded before and after local anaesthetic administration; the quantity and time of day of any other analgesia taken were also recorded. Results: Twenty-eight female and four male patients (mean age 14.5 ± 1.5 years) received a total of 230 boluses of local anaesthetic in the 96-hour period following surgery. Pain scores decreased significantly following administration of a bolus (p < 0.0001), with the mean pain score falling from 3.66 to 1.83. The quantity of opiates delivered via patient-controlled analgesia after surgery decreased steadily between successive 24-hour intervals, after an initial increase in the second 24-hour period when patients were mobilised. One intrapleural catheter required early removal due to leakage; there were no other complications associated with the intermittent intrapleural analgesia method. Conclusions: Local anaesthetic administration via an intrapleural catheter is a safe and effective method of analgesia following thoracoscopic anterior scoliosis correction. Post-operative pain following anterior thoracic scoliosis surgery can be reduced to 'mild' levels by combined analgesia regimes. Keywords: Adolescent idiopathic scoliosis; Thoracoscopic anterior spinal fusion; Anterior fusion; Intrapleural analgesia; Endoscopic anterior surgery; Pain relief; Scoliosis surgery
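As an illustration of the pre/post-bolus comparison reported above, here is a minimal Python sketch of a paired test on VAS scores; the scores are hypothetical, and the abstract does not state which paired test the authors used.

```python
import numpy as np
from scipy import stats

# Hypothetical paired VAS scores (0-10) before and after an intrapleural bolus.
# The abstract reports mean scores falling from 3.66 to 1.83 with p < 0.0001.
before = np.array([4.0, 3.5, 5.0, 3.0, 4.5, 2.5, 4.0, 3.0])
after = np.array([2.0, 1.5, 2.5, 1.0, 2.0, 1.5, 2.5, 1.5])

t_stat, p_value = stats.ttest_rel(before, after)  # paired t-test
print(f"mean before={before.mean():.2f}, after={after.mean():.2f}, p={p_value:.4g}")
```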
Abstract:
The cardiac catheterisation laboratory (CCL) is a specialised medical radiology facility where both chronic-stable and life-threatening cardiovascular illness is evaluated and treated. Although there are many potential sources of discomfort and distress associated with procedures performed in the CCL, a general anaesthetic is not usually required. For this reason, an anaesthetist is not routinely assigned to the CCL. Instead, to manage pain, discomfort and anxiety during the procedure, nurses administer a combination of sedative and analgesic medications according to direction from the cardiologist performing the procedure. This practice is referred to as nurse-administered procedural sedation and analgesia (PSA). While anecdotal evidence suggested that nurse-administered PSA was commonly used in the CCL, the limited information available made clear that current nurse-led PSA administration and monitoring practices varied, and that there was contention around some aspects of practice, including the types of medications suitable for use and the depth of sedation that could be safely induced without an anaesthetist present. The overall aim of the program of research presented in this thesis was to establish an evidence base for nurse-led sedation practices in the CCL context. A sequential mixed-methods design was used over three phases.

The objective of the first phase was to appraise the existing evidence for nurse-administered PSA in the CCL. Two studies were conducted. The first was an integrative review of empirical research studies and clinical practice guidelines focused on nurse-administered PSA in the CCL as well as in other, similar procedural settings. This was the first review to systematically appraise the available evidence supporting the use of nurse-administered PSA in the CCL. A major finding was that nurse-administered PSA in the CCL was generally deemed safe. However, the analysis of the included studies and guidelines showed that the management of sedation in the CCL was affected by a variety of contextual factors, including local hospital policy, workforce constraints and cardiologists' preferences for the type of sedation used. The second study in the first phase was conducted to identify a sedation scale that could be used to monitor level of sedation during nurse-administered PSA in the CCL. It involved a structured literature review and psychometric analysis of scale properties. Only one scale was found that had been developed specifically for the CCL, and it had not undergone psychometric testing; several weaknesses were identified in its item structure. The other sedation scales identified had been developed for the ICU, and although these scales have demonstrated validity and reliability in that setting, weaknesses in their item structure precluded their use in the CCL. As the findings indicated that no existing sedation scale should be applied to practice in the CCL, recommendations for the development and psychometric testing of a new sedation scale were made.

The objective of the second phase was to explore current practice. Three studies were conducted in this phase, using both quantitative and qualitative research methods. The first was a qualitative exploratory study of nurses' perceptions of the issues and challenges associated with nurse-administered PSA in the CCL. Major themes emerged from the analysis of the qualitative data regarding the lack of access to anaesthetists, the limitations of sedative medications, the barriers to effective patient monitoring, and the impact of increasingly complex procedures on patients' sedation requirements. The second study in Phase Two was a cross-sectional survey of nurse-administered PSA practice in Australian and New Zealand CCLs. This was the first study to quantify how frequently nurse-administered PSA is used in the CCL setting and to characterise the associated nursing practices. Nearly all CCLs (94%) were found to use nurse-administered PSA. Characterising nurse-administered PSA in Australian and New Zealand CCLs also identified several strategies to improve practice, such as establishing protocols for patient monitoring and comprehensive PSA education for CCL nurses. The third study in Phase Two was a matched case-control study of risk factors for impaired respiratory function during nurse-administered PSA in the CCL setting. Patients with acute illness were found to be nearly twice as likely to experience impaired respiratory function during nurse-administered PSA (OR=1.78; 95% CI=1.19-2.67; p=0.005). These findings can now inform prospective studies investigating the effectiveness of interventions for impaired respiratory function during nurse-administered PSA in the CCL.

The objective of the third and final phase was to develop recommendations for practice. To achieve this, a synthesis of findings from the previous phases informed a modified Delphi study, conducted to develop a set of clinical practice guidelines for nurse-administered PSA in the CCL. The resulting guidelines set current best-practice standards for pre-procedural patient assessment and risk screening, as well as the intra- and post-procedural patient monitoring that nurses who administer PSA in the CCL should undertake to deliver safe, evidence-based and consistent care to the many patients who undergo procedures in this setting. In summary, the mixed-methods approach enabled the research objectives to be comprehensively addressed in an informed, sequential manner, and this thesis has generated a substantial amount of new knowledge to inform and support nurse-led sedation practice in the CCL context. A limitation to note is that the comprehensive appraisal of the evidence, combined with the guideline development process, highlighted numerous deficiencies in the evidence base; rather than being based on high-level evidence, many of the recommendations for practice were produced by consensus. Further research is therefore required to ascertain which specific practices result in optimal patient and health service outcomes. Along with the necessary guideline implementation and evaluation projects, post-doctoral research is planned to follow up on the identified research gaps as part of a continuing program of research in this field.
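For the case-control finding above, the sketch below shows how an odds ratio and a Wald 95% confidence interval are computed from a 2×2 exposure table; the counts are hypothetical, and because the study used a matched design, a conditional (discordant-pair) estimate would ordinarily be preferred over this unmatched calculation.

```python
import math

# Hypothetical 2x2 exposure table (acute illness vs impaired respiratory
# function). These counts are illustrative only; they are not study data.
a, b = 60, 45   # cases: exposed, unexposed
c, d = 40, 55   # controls: exposed, unexposed

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR={odds_ratio:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")
```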
Abstract:
Aims and objectives: To examine Chinese cancer patients' fatigue self-management, including the types of self-management behaviours used, their confidence in using these behaviours, the degree of relief obtained and the factors associated with patients' use of fatigue self-management behaviours. Background: Fatigue places a significant burden on patients with cancer undergoing chemotherapy. While some studies have explored fatigue self-management in Western settings, very few have explored self-management behaviours in China. Design: Cross-sectional self- and/or interviewer-administered survey. Methods: A total of 271 participants with self-reported fatigue in the past week were recruited from a specialist cancer hospital in south-east China. Participants completed measures assessing the use of fatigue self-management behaviours, corresponding self-efficacy and perceived relief levels, plus items assessing demographic characteristics, fatigue experiences, distress and social support. Results: A mean of 4.94 (±2.07; range 1–10) fatigue self-management behaviours was reported. Most behaviours were rated as providing moderate relief and were implemented with moderate self-efficacy. Regression analyses identified that having more support from one's neighbourhood and better functional status predicted the use of a greater number of self-management behaviours. Separate regression analyses identified that greater neighbourhood support predicted greater relief from 'activity enhancement behaviours' and that better functional status predicted greater relief from 'rest and sleep behaviours'. Higher self-efficacy scores predicted greater relief from the corresponding behaviours. Conclusions: A range of fatigue self-management behaviours were initiated by Chinese patients with cancer. Individual, condition and environmental factors were found to influence engagement in, and relief from, fatigue self-management behaviours. Relevance to clinical practice: Findings highlight the need for nurses to explore patients' use of fatigue self-management behaviours and the effectiveness of these behaviours in reducing fatigue. Interventions that improve patients' self-efficacy and neighbourhood supports have the potential to improve outcomes from fatigue self-management behaviours.