991 results for NEUTROPENIC PATIENTS
Abstract:
Despite the Revised International Prognostic Index's (R-IPI) undoubted utility in diffuse large B-cell lymphoma (DLBCL), significant clinical heterogeneity within R-IPI categories persists. Emerging evidence indicates that circulating host immunity is a robust and R-IPI-independent prognosticator, most likely reflecting the immune status of the intratumoral microenvironment. We hypothesized that direct quantification of immunity within lymphomatous tissue would permit better stratification within R-IPI categories. We analyzed 122 consecutive newly diagnosed DLBCL patients treated with rituximab, cyclophosphamide, doxorubicin, vincristine, and prednisone (R-CHOP) chemo-immunotherapy. Median follow-up was 4 years. As expected, the R-IPI was a significant predictor of outcome, with 5-year overall survival (OS) of 87% for very good, 87% for good, and 51% for poor-risk R-IPI scores (P < 0.001). Consistent with previous reports, systemic immunity also predicted outcome (86% OS for a high lymphocyte-to-monocyte ratio [LMR] versus 63% for a low LMR, P = 0.01). Multivariate analysis confirmed LMR as independently prognostic. Flow cytometry on fresh diagnostic lymphoma tissue identified CD4+ T-cell infiltration as the most significant predictor of outcome, with ≥23% infiltration dividing the cohort into high- and low-risk groups with regard to event-free survival (EFS, P = 0.007) and OS (P = 0.003). This prognostic effect was independent of the R-IPI and LMR. Importantly, within very good/good R-IPI patients, CD4+ T-cells still distinguished patients with different 5-year OS (high 96% versus low 63%, P = 0.02). These results illustrate the importance of circulating and local intratumoral immunity in DLBCL treated with R-CHOP.
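A minimal sketch (not the authors' code) of the kind of survival analysis described above: stratifying patients at the 23% CD4+ infiltration cut-point, comparing overall survival with a log-rank test, and checking independence from the R-IPI and LMR in a Cox model. The DataFrame columns, file name and numeric coding of the R-IPI are assumptions for illustration.

```python
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

# Hypothetical cohort file with columns: time_years (follow-up), death (1 = died),
# cd4_pct (intratumoral CD4+ T-cell %), lmr (lymphocyte-to-monocyte ratio),
# r_ipi (numeric R-IPI risk score).
df = pd.read_csv("dlbcl_cohort.csv")

# Dichotomise CD4+ infiltration at the 23% cut-point reported in the abstract.
df["cd4_high"] = (df["cd4_pct"] >= 23).astype(int)

# Log-rank comparison of overall survival between high and low CD4+ groups.
high, low = df[df["cd4_high"] == 1], df[df["cd4_high"] == 0]
result = logrank_test(high["time_years"], low["time_years"],
                      event_observed_A=high["death"], event_observed_B=low["death"])
print(f"log-rank P = {result.p_value:.3f}")

# Multivariable Cox model to assess independence from R-IPI and LMR.
cph = CoxPHFitter()
cph.fit(df[["time_years", "death", "cd4_high", "lmr", "r_ipi"]],
        duration_col="time_years", event_col="death")
cph.print_summary()
```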
Abstract:
Background: Patients with Crohn's disease (CD) often require surgery at some stage of the disease course. Prediction of CD outcome is influenced by clinical, environmental, serological, and genetic factors (e.g., NOD2). Being able to identify CD patients at high risk of surgical intervention should help clinicians decide whether or not to prescribe early aggressive treatment with immunomodulators. Methods: We performed a retrospective analysis of selected clinical (age at diagnosis, perianal disease, active smoking) and genetic (NOD2 genotype) data obtained for a population-based CD cohort from the Canterbury Inflammatory Bowel Disease study. Logistic regression was used to identify predictors of complicated outcome in these CD patients (i.e., need for inflammatory bowel disease-related surgery). Results: Perianal disease and the NOD2 genotype were the only independent factors associated with the need for surgery in this patient group (odds ratio=2.84 and 1.60, respectively). By combining the associated NOD2 genotype with perianal disease we generated a single "clinicogenetic" variable. This was strongly associated with increased risk of surgery (odds ratio=3.84, confidence interval 2.28-6.46, P=0.00) and offered moderate predictive accuracy (positive predictive value=0.62). Approximately one-third of surgical outcomes in this population were attributable to the NOD2+PA variable (attributable risk=0.32). Conclusions: Knowledge of perianal disease and NOD2 genotype in patients presenting with CD may offer clinicians some decision-making utility for identifying complicated CD progression early and for initiating intensive treatment to avoid surgical intervention. Future studies should investigate combination effects of other genetic, clinical, and environmental factors when attempting to identify predictors of complicated CD outcomes.
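A minimal sketch (not the study's code) of how the logistic-regression and "clinicogenetic" analysis above could be set up. The DataFrame columns, the file name, and the coding of the combined NOD2+perianal variable (here, carrying either feature) are assumptions; the attributable-risk line uses one common population-attributable-fraction formula.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical cohort file with binary columns surgery, perianal, nod2_risk, smoker
# and a continuous column age_dx (age at diagnosis).
df = pd.read_csv("cd_cohort.csv")

# Combined "clinicogenetic" variable (one plausible coding: NOD2 risk genotype or perianal disease).
df["nod2_pa"] = ((df["nod2_risk"] == 1) | (df["perianal"] == 1)).astype(int)

# Logistic regression for the need for IBD-related surgery.
model = smf.logit("surgery ~ nod2_pa + age_dx + smoker", data=df).fit()
print(np.exp(model.params))  # odds ratios

# Positive predictive value of the combined variable.
ppv = df.loc[df["nod2_pa"] == 1, "surgery"].mean()

# Population attributable fraction: (overall risk - risk in unexposed) / overall risk.
p_all = df["surgery"].mean()
p_unexposed = df.loc[df["nod2_pa"] == 0, "surgery"].mean()
par = (p_all - p_unexposed) / p_all
print(f"PPV = {ppv:.2f}, attributable risk = {par:.2f}")
```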
Abstract:
Aim: To examine the validity and reliability of the translated Hill-Bone scale among 110 hypertensive participants from an Arabic-speaking country. Background: With the widespread availability of treatment, individuals with hypertension have reported various levels of adherence to their medications. Flexible and practical methods of measuring adherence include surveys, scales and interviews. There is a scarcity of Arabic tools and scales that measure levels of adherence to antihypertensive treatments in the Arabic-speaking context. Design and Methods: A cross-sectional study was conducted among 110 individuals diagnosed with hypertension and from an Arabic-speaking country. The Hill-Bone scale includes three subscales that measure salt intake, medication adherence and appointment keeping. Given the focus on the pharmacological management of hypertensive patients, only items from the medication adherence and appointment keeping subscales were used. The scale was translated by following a comprehensive and accepted method of translation. Results: Instrument reliability was tested by calculating the Cronbach's alpha coefficient. The medication adherence subscale of the Hill-Bone scale showed an acceptable level of reliability (Cronbach's alpha = 0.76). Consistent with other translated versions of the Hill-Bone scale, the Arabic version also showed good reliability and validity. Conclusion: Results indicate that the Arabic version of the Hill-Bone scale has an acceptable level of reliability and validity and can therefore be used in Arabic-speaking populations.
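A minimal sketch of the reliability statistic reported above (Cronbach's alpha), assuming a hypothetical CSV in which each column is one medication-adherence item of the translated scale and each row is one participant.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

items = pd.read_csv("hill_bone_arabic_items.csv")  # hypothetical file, one column per item
print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")
```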
Abstract:
Migraine is a painful and debilitating neurovascular disease. Current migraine head pain treatments work with differing efficacies in migraineurs. The opioid system plays an important role in diverse biological functions including analgesia, drug response and pain reduction. The A118G single nucleotide polymorphism (SNP) in exon 1 of the μ-opioid receptor gene (OPRM1) has been associated with elevated pain responses and decreased pain threshold in a variety of populations. The aim of this preliminary study was to test whether genotypes of the OPRM1 A118G SNP are associated with head pain severity in a clinical cohort of female migraineurs. A total of 153 chronic migraine with aura sufferers were assessed for migraine head pain using the Migraine Disability Assessment Score instrument and classified into high and low pain severity groups. DNA was extracted and genotypes obtained for the A118G SNP. Logistic regression analysis adjusting for age effects showed the A118G SNP of the OPRM1 gene to be significantly associated with migraine pain severity in the test population (P = 0.0037). In particular, G118 allele carriers were more likely to be high pain sufferers compared with homozygous carriers of the A118 allele (OR = 3.125, 95% CI = 1.41-6.93, P = 0.0037). These findings suggest that A118G genotypes of the OPRM1 gene may influence migraine-associated head pain in females. Further investigations are required to fully understand the effect of this gene variant on migraine head pain, including studies in males and in different migraine subtypes, as well as in response to head pain medication.
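A minimal sketch (not the study's code) of the age-adjusted logistic regression described above, with the odds ratio and 95% confidence interval for G118 carriers. The column names and file name are assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical cohort file with columns: high_pain (1 = high severity group),
# g_carrier (1 = carries at least one G118 allele), age.
df = pd.read_csv("migraine_cohort.csv")

model = smf.logit("high_pain ~ g_carrier + age", data=df).fit()

# Odds ratio and 95% CI for G118 carriers versus A118 homozygotes.
odds_ratio = np.exp(model.params["g_carrier"])
ci_low, ci_high = np.exp(model.conf_int().loc["g_carrier"])
print(f"OR = {odds_ratio:.3f} (95% CI {ci_low:.2f}-{ci_high:.2f}), "
      f"P = {model.pvalues['g_carrier']:.4f}")
```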
Abstract:
Recent association studies in multiple sclerosis (MS) have identified and replicated several single nucleotide polymorphism (SNP) susceptibility loci, including CLEC16A, IL2RA, IL7R, RPL5, CD58, CD40 and chromosome 12q13–14, in addition to the well-established allele HLA-DR15. These genetic susceptibility factors could potentially also modulate MS disease severity, as demonstrated previously for the MS risk allele HLA-DR15. We investigated this hypothesis in a cohort of 1006 well-characterised MS patients from South-Eastern Australia. We tested the MS-associated SNPs for association with five measures of disease severity incorporating disability, age of onset, cognition and brain atrophy. We observed trends towards association between the RPL5 risk SNP and time between first demyelinating event and relapse, and between the CD40 risk SNP and symbol digit test score. No associations remained significant after correction for multiple testing. We found no evidence to support the hypothesis that these newly identified MS risk-associated SNPs influence disease severity.
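A minimal sketch of the multiple-testing correction step mentioned above. The p-values below are purely illustrative placeholders (not study results), and a Bonferroni correction is shown as one common choice.

```python
from statsmodels.stats.multitest import multipletests

# Raw p-values from each SNP-by-severity-measure association test (placeholders only).
raw_p = {
    "RPL5_time_to_relapse": 0.004,
    "CD40_symbol_digit": 0.009,
    "CLEC16A_disability": 0.21,
    "IL2RA_age_of_onset": 0.47,
}

reject, p_adjusted, _, _ = multipletests(list(raw_p.values()),
                                         alpha=0.05, method="bonferroni")
for (name, p), p_adj, significant in zip(raw_p.items(), p_adjusted, reject):
    print(f"{name}: raw P = {p:.3f}, corrected P = {p_adj:.3f}, significant = {significant}")
```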
Abstract:
Background Health-related quality of life (HRQoL) is an important outcome for patients diagnosed with coronary heart disease. This report describes predictors of physical and mental HRQoL at six months post-hospitalisation for myocardial infarction. Methods Participants were myocardial infarction patients (n=430) admitted to two tertiary referral centres in Brisbane, Australia who completed a six-month coronary heart disease secondary prevention trial (ProActive Heart). Outcome variables were HRQoL (Short Form-36) at six months, including a physical and a mental summary score. Baseline predictors included demographic and clinical variables, health behaviours, and psychosocial variables. Stepwise forward multiple linear regression analyses were used to identify significant independent predictors of six-month HRQoL. Results Physical HRQoL was lower in participants who: were older (p<0.001); were unemployed (p=0.03); had lower baseline physical and mental HRQoL scores (p<0.001); had lower confidence in meeting recommendations for sufficient physical activity (p<0.001); had no intention to be physically active in the next six months (p<0.001); and were more sedentary (p=0.001). Mental HRQoL was lower in participants who: were younger (p=0.01); had lower baseline mental HRQoL (p<0.001); were more sedentary (p=0.01); were depressed (p<0.001); and had lower social support (p=0.001). Conclusions This study has clinical implications: identifying indicators of lower physical and mental HRQoL in myocardial infarction patients allows for targeted counselling or coronary heart disease secondary prevention efforts. Trial registration Australian Clinical Trials Registry, Australian New Zealand Clinical Trials Registry, CTRN12607000595415. Keywords: Myocardial infarction; Secondary prevention; Cardiac rehabilitation; Telephone-delivered; Health-related quality of life; Health coaching; Tele-health
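A minimal sketch (not the trial's code) of a stepwise forward multiple linear regression of the kind described above. The outcome and predictor column names and the file name are assumptions, and a simple p-value entry criterion is used for illustration.

```python
import pandas as pd
import statsmodels.formula.api as smf

def forward_select(df, outcome, candidates, enter_p=0.05):
    """Repeatedly add the candidate predictor with the smallest p-value below enter_p."""
    selected = []
    while True:
        remaining = [c for c in candidates if c not in selected]
        pvals = {}
        for c in remaining:
            fit = smf.ols(f"{outcome} ~ {' + '.join(selected + [c])}", data=df).fit()
            pvals[c] = fit.pvalues[c]
        if not pvals or min(pvals.values()) >= enter_p:
            break
        selected.append(min(pvals, key=pvals.get))
    rhs = " + ".join(selected) if selected else "1"
    return smf.ols(f"{outcome} ~ {rhs}", data=df).fit()

df = pd.read_csv("proactive_heart_baseline.csv")  # hypothetical file
final_model = forward_select(df, "physical_hrqol_6m",
                             ["age", "unemployed", "physical_hrqol_baseline",
                              "mental_hrqol_baseline", "activity_confidence",
                              "activity_intention", "sedentary_hours"])
print(final_model.summary())
```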
Abstract:
Background Many Australian cities experience large winter increases in deaths and hospitalisations. Influenza outbreaks are only part of the problem, and inadequate protection from cold weather is a key independent risk factor. Better home insulation has been shown to improve health during winter, but no study has examined whether better personal insulation improves health. Data and Methods We ran a randomised controlled trial of thermal clothing versus usual care. Subjects with heart failure (a group vulnerable to cold) were recruited from a public hospital in Brisbane in winter and followed up at the end of winter. Those randomised to the intervention received two thermal hats and tops and a digital thermometer. The primary outcome was the number of days in hospital, with secondary outcomes of General Practitioner (GP) visits and self-rated health. Results The mean number of days in hospital per 100 winter days was 2.5 in the intervention group and 1.8 in the usual care group, a mean difference of 0.7 (95% CI: –1.5, 5.4). The intervention group had 0.2 fewer GP visits on average (95% CI: –0.8, 0.3) and slightly better self-rated health (mean improvement –0.3; 95% CI: –0.9, 0.3). The thermal tops were generally well used, but even in cold temperatures the hats were worn by only 30% of subjects. Conclusions Thermal clothes are a cheap and simple intervention, but further work needs to be done on increasing compliance and on confirming the health and economic benefits of providing thermals to at-risk groups.
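A minimal sketch of the between-group comparison reported above: a mean difference in hospital days with a 95% confidence interval from a standard two-sample t interval. The data files are hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical files: one value per subject (days in hospital per 100 winter days).
intervention = np.loadtxt("thermal_group_days.txt")
usual_care = np.loadtxt("usual_care_days.txt")

n1, n2 = len(intervention), len(usual_care)
diff = intervention.mean() - usual_care.mean()

# Pooled-variance standard error and 95% CI for the mean difference.
pooled_var = ((n1 - 1) * intervention.var(ddof=1) +
              (n2 - 1) * usual_care.var(ddof=1)) / (n1 + n2 - 2)
se = np.sqrt(pooled_var * (1 / n1 + 1 / n2))
half_width = stats.t.ppf(0.975, n1 + n2 - 2) * se
print(f"mean difference = {diff:.1f} "
      f"(95% CI {diff - half_width:.1f} to {diff + half_width:.1f})")
```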
Abstract:
The Australasian Nutrition Care Day Survey (ANCDS) reported that two in five patients in Australian and New Zealand hospitals consume ≤50% of the offered food. The ANCDS found a significant association between poor food intake and increased in-hospital mortality after controlling for confounders (nutritional status, age, disease type and severity) [1]. Evidence for the effectiveness of medical nutrition therapy (MNT) in hospital patients eating poorly is lacking. An exploratory study was conducted in the respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, 24-hour food intake (0%, 25%, 50%, 75%, 100% of offered meals) was evaluated for patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals due to nutrition-impact symptoms were referred to ward dietitians for MNT, with food intake re-evaluated on day 7. A total of 184 patients were observed over four weeks. Sixty-two patients (34%) consumed ≤50% of the offered meals. Simple interventions (feeding/menu assistance, diet texture modifications) improved intake to ≥75% in 30 patients, who did not require further MNT. Of the 32 patients referred for MNT, baseline and day-7 data were available for 20 patients (68±17 years, 65% female, BMI 22±5 kg/m²; median energy and protein intake 2250 kJ and 25 g, respectively). On day 7, 17 participants (85%) demonstrated significantly higher consumption (4300 kJ and 53 g; p<0.01). Three participants demonstrated no improvement due to ongoing nutrition-impact symptoms. "Percentage food intake" was a quick tool to identify patients in whom simple interventions could enhance intake. MNT was associated with improved dietary intake in hospital patients. Further research is needed to establish a causal relationship.
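A minimal sketch of the baseline versus day-7 comparison described above, using a paired non-parametric test (a Wilcoxon signed-rank test is one reasonable choice given the reported medians). The data files and their contents are hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical paired files: daily energy intake (kJ) for the same patients,
# in the same order, at baseline and on day 7 after medical nutrition therapy.
baseline_kj = np.loadtxt("baseline_energy_kj.txt")
day7_kj = np.loadtxt("day7_energy_kj.txt")

statistic, p_value = stats.wilcoxon(baseline_kj, day7_kj)
print(f"median baseline = {np.median(baseline_kj):.0f} kJ, "
      f"median day 7 = {np.median(day7_kj):.0f} kJ, Wilcoxon P = {p_value:.3f}")
```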
Abstract:
This paper reports on mixed-method empirical research undertaken with individuals who have completed advance health directives ('principals') and doctors who have either attested to the principal's capacity when the document was completed or been called upon to use these documents in clinical settings. Principals and doctors appear to have different understandings of the purpose of these documents and of their role in decision-making about medical treatment. We recommend changes to the advance health directive form in Queensland to promote informed decision-making, which will help to better align the perceptions of principals and doctors about the role of these documents.
Abstract:
Purpose Obstructive sleep apnoea (OSA) patients effectively treated by and compliant with continuous positive airway pressure (CPAP) occasionally miss a night's treatment. The purpose of this study was to use a real-car interactive driving simulator to assess the effects of such an occurrence on the next day's driving, including the extent to which these drivers are aware of increased sleepiness. Methods Eleven long-term compliant CPAP-treated 50–75-year-old male OSA participants completed a 2-h realistic, monotonous, simulated afternoon drive in an instrumented car twice, following one night of: (1) normal sleep with CPAP and (2) nil CPAP. Drifting out of the road lane ('incidents'), subjective sleepiness every 200 s, and continuous electroencephalogram (EEG) activity indicative of sleepiness and compensatory effort were monitored. Results Withdrawal of CPAP markedly increased sleep disturbance and led to significantly more incidents, a shorter 'safe' driving duration, increased alpha and theta EEG power and greater subjective sleepiness. However, increased EEG beta activity indicated that more compensatory effort was being applied. Importantly, under both conditions there was a highly significant correlation between subjective and EEG measures of sleepiness, to the extent that participants were well aware of the effects of nil CPAP. Conclusions Patients should be aware that compliance with treatment every night is crucial for safe driving.
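A minimal sketch of the correlation between subjective and EEG measures of sleepiness described above. The files, the use of combined alpha+theta power, and the choice of a Spearman correlation (reasonable for ordinal sleepiness ratings) are assumptions.

```python
import numpy as np
from scipy import stats

# Hypothetical per-epoch data for one drive: a subjective sleepiness rating every 200 s
# and the matching alpha+theta EEG power for the same epochs.
sleepiness_ratings = np.loadtxt("subjective_sleepiness.txt")
alpha_theta_power = np.loadtxt("eeg_alpha_theta_power.txt")

rho, p_value = stats.spearmanr(sleepiness_ratings, alpha_theta_power)
print(f"Spearman rho = {rho:.2f}, P = {p_value:.4f}")
```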
Abstract:
Introduction Patients with virally mediated head and neck cancer (VMHNC) often present with advanced nodal disease that is highly radioresponsive, as demonstrated by tumour and nodal regression during treatment. The resultant changes may alter the planned dose distribution and so adversely affect the therapeutic ratio. The aim of this study was to evaluate the dosimetric effect of treatment-induced anatomical changes in VMHNC patients who had undergone a re-plan. Methods Thirteen patients with virally mediated oropharyngeal or nasopharyngeal cancer who presented for definitive radiotherapy between 2005 and 2010 and who had a re-plan generated were investigated. The dosimetric effect of anatomical changes was quantified by comparing dose-volume histograms (DVHs) of primary and nodal gross tumour volumes and organs at risk (OARs), including the spinal cord and parotid glands, from the original plan and a comparison plan. Results Eleven 3DCRT and two IMRT plans were evaluated. Dose to the spinal cord and brainstem increased by 4.1% and 2.6%, respectively. Mean dose to the parotid glands also increased, by 3.5%. In contrast, the dose received by 98% of the primary and nodal gross tumour volumes decreased by 0.15% and 0.3%, respectively, when comparing the initial treatment plan with the comparison plan. Conclusion In this study, treatment-induced anatomical changes had the greatest impact on OAR dose, with negligible effect on the dose to nodal gross tumour volumes. In the era of intensity-modulated radiotherapy (IMRT), accounting for treatment-induced anatomical changes is important as focus is placed on minimising the acute and long-term side effects of treatment.
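A minimal sketch of the kind of DVH comparison described above: extracting the dose received by 98% of a target volume (D98%) from a cumulative DVH for the original and comparison plans and reporting the percentage change. The file format (dose in Gy, fractional volume) is an assumption.

```python
import numpy as np

def d98(dose_gy: np.ndarray, frac_volume: np.ndarray) -> float:
    """Dose received by at least 98% of the structure volume, by linear interpolation."""
    # A cumulative DVH is monotonically decreasing in volume, so reverse both arrays
    # to give np.interp an increasing x-axis.
    return float(np.interp(0.98, frac_volume[::-1], dose_gy[::-1]))

# Hypothetical two-column DVH exports: dose (Gy), fractional volume receiving >= dose.
orig_dose, orig_vol = np.loadtxt("gtv_dvh_original.txt", unpack=True)
comp_dose, comp_vol = np.loadtxt("gtv_dvh_comparison.txt", unpack=True)

d98_original = d98(orig_dose, orig_vol)
d98_comparison = d98(comp_dose, comp_vol)
pct_change = 100 * (d98_comparison - d98_original) / d98_original
print(f"D98%: original {d98_original:.1f} Gy, comparison {d98_comparison:.1f} Gy "
      f"({pct_change:+.2f}%)")
```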
Abstract:
Results of recent studies suggest that circulating levels of vitamin D may play an important role in cancer-specific outcomes. The present systematic review was undertaken to determine the prevalence of vitamin D deficiency (<25 nmol/L) and insufficiency (25-50 nmol/L) in cancer patients and to evaluate the association between circulating calcidiol (the indicator of vitamin D status) and clinical outcomes. A systematic search of original, peer-reviewed studies on calcidiol at cancer diagnosis, and throughout treatment and survival, was conducted, yielding 4,706 studies. A total of 37 studies met the inclusion criteria for this review. Reported mean blood calcidiol levels ranged from 24.7 to 87.4 nmol/L, with up to 31% of patients identified as deficient and 67% as insufficient. The efficacy of cholecalciferol supplementation for raising the concentration of circulating calcidiol is unclear; standard supplement regimens of <1,000 IU D3/day may not be sufficient to maintain adequate concentrations or to prevent declining calcidiol. Dose-response studies linking vitamin D status to musculoskeletal and survival outcomes in cancer patients are lacking.
Abstract:
Background and aims The Australasian Nutrition Care Day Survey (ANCDS) reported that two in five patients consume ≤50% of the offered food in Australian and New Zealand hospitals. After controlling for confounders (nutritional status, age, disease type and severity), the ANCDS also established an independent association between poor food intake and increased in-hospital mortality. This study aimed to evaluate whether medical nutrition therapy (MNT) could improve dietary intake in hospital patients eating poorly. Methods An exploratory pilot study was conducted in the respiratory, neurology and orthopaedic wards of an Australian hospital. At baseline, percentage food intake (0%, 25%, 50%, 75%, and 100%) was evaluated for each main meal and snack over a 24-hour period in patients hospitalised for ≥2 days and not under dietetic review. Patients consuming ≤50% of offered meals due to nutrition-impact symptoms were referred to ward dietitians for MNT. Food intake was re-evaluated on the seventh day following recruitment (post-MNT). Results A total of 184 patients were observed over four weeks; 32 patients were referred for MNT. Although baseline and post-MNT data for 20 participants (68±17 years, 65% female) indicated a significant increase in median energy and protein intake post-MNT (3600 kJ/day, 40 g/day) versus baseline (2250 kJ/day, 25 g/day) (p<0.05), the increased intake met only 50% of dietary requirements. Persistent nutrition-impact symptoms affected intake. Conclusion In this pilot study, whilst dietary intake improved with MNT, it remained inadequate to meet participants' estimated requirements due to ongoing nutrition-impact symptoms. Appropriate medical management and early enteral feeding could be a possible solution for such patients.