117 results for Burns and scalds -- Patients -- Rehabilitation. Burns and scalds in children.
Abstract:
Families of 52 first-admission patients diagnosed with a severe psychiatric disorder were videotaped interacting with the patient. Behavioral coding was used to derive several indices of interaction: base rates of positive and negative behavior by patients and relatives, cumulative affect of patients and relatives (the difference between the rates of positive and negative behaviors), and classification of families as affect-regulated or unregulated. Family-affect regulation reflects positive cumulative affect by both people in a given interaction. Six months after hospital discharge patients were assessed on occurrence of relapse, global functioning, severity of psychiatric symptoms, and quality of life. Relative to affect-unregulated family interaction, affect-regulated interaction predicted significantly fewer relapses, better global functioning, fewer positive and negative psychiatric symptoms, and higher patient quality of life. Most of the predictions by family-affect regulation were independent of
Abstract:
Rationale: The Australasian Nutrition Care Day Survey (ANCDS) evaluated whether malnutrition and decreased food intake are independent risk factors for negative outcomes in hospitalised patients. Methods: A multicentre (56 hospitals) cross-sectional survey was conducted in two phases. Phase 1 evaluated nutritional status (defined by Subjective Global Assessment) and 24-hour food intake recorded as 0, 25, 50, 75, and 100% intake. Phase 2 data, which included length of stay (LOS), readmissions and mortality, were collected 90 days post-Phase 1. Logistic regression was used to control for confounders: age, gender, disease type and severity (using Patient Clinical Complexity Level scores). Results: Of 3122 participants (53% males, mean age: 65±18 years), 32% were malnourished and 23% consumed ≤25% of the offered food. Median LOS for malnourished (MN) patients was higher than that of well-nourished (WN) patients (15 vs. 10 days, p<0.0001). Median LOS for patients consuming ≤25% of the food was higher than for those consuming ≥50% (13 vs. 11 days, p<0.0001). MN patients had higher readmission rates (36% vs. 30%, p = 0.001). The odds of 90-day in-hospital mortality were 1.8 times greater for MN patients (CI: 1.03-3.22, p = 0.04) and 2.7 times greater for those consuming ≤25% of the offered food (CI: 1.54-4.68, p = 0.001). Conclusion: The ANCDS demonstrates that malnutrition and/or decreased food intake are associated with longer LOS and readmissions. The survey also establishes that malnutrition and decreased food intake are independent risk factors for in-hospital mortality in acute care patients, and highlights the need for appropriate nutritional screening and support during hospitalisation. Disclosure of Interest: None Declared.
Abstract:
This paper reports on mixed method empirical research undertaken with individuals who have completed advance health directives (‘principals’) and doctors who have either attested to the principal’s capacity when the document was completed or been called upon to use these documents in clinical settings. Principals and doctors appear to have different understandings of the purpose of these documents and their role in decision-making about medical treatment. We recommend changes to the advance health directive form in Queensland to promote informed decision-making which will help to better align perceptions of principals and doctors about the role of these documents.
Abstract:
Results of recent studies suggest that circulating levels of vitamin D may play an important role in cancer-specific outcomes. The present systematic review was undertaken to determine the prevalence of vitamin D deficiency (<25 nmol/L) and insufficiency (25-50 nmol/L) in cancer patients and to evaluate the association between circulating calcidiol (the indicator of vitamin D status) and clinical outcomes. A systematic search of original, peer-reviewed studies on calcidiol at cancer diagnosis, and throughout treatment and survival, was conducted yielding 4,706 studies. A total of 37 studies met the inclusion criteria for this review. Reported mean blood calcidiol levels ranged from 24.7 to 87.4 nmol/L, with up to 31% of patients identified as deficient and 67% as insufficient. The efficacy of cholecalciferol supplementation for raising the concentration of circulating calcidiol is unclear; standard supplement regimens of <1,000 IU D3/day may not be sufficient to maintain adequate concentrations or prevent decreasing calcidiol. Dose-response studies linking vitamin D status to musculoskeletal and survival outcomes in cancer patients are lacking.
Abstract:
BACKGROUND. The authors compared gemcitabine and carboplatin (GC) with mitomycin, ifosfamide, and cisplatin (MIC) or mitomycin, vinblastine, and cisplatin (MVP) in patients with advanced nonsmall cell lung carcinoma (NSCLC). The primary objective was survival. Secondary objectives were time to disease progression, response rates, evaluation of toxicity, disease-related symptoms, World Health Organization performance status (PS), and quality of life (QoL). METHODS. Three hundred seventy-two chemotherapy-naïve patients with International Staging System Stage III/IV NSCLC who were ineligible for curative radiotherapy or surgery were randomized to receive either 4 cycles of gemcitabine (1000 mg/m2 on Days 1, 8, and 15) plus carboplatin (area under the serum concentration-time curve, 5; given on Day 1) every 4 weeks (the GC arm) or MIC/MVP every 3 weeks (the MIC/MVP arm). RESULTS. There was no significant difference in median survival (248 days in the MIC/MVP arm vs. 236 days in the GC arm) or time to progression (225 days in the MIC/MVP arm vs. 218 days in the GC arm) between the 2 treatment arms. The 2-year survival rate was 11.8% in the MIC/MVP arm and 6.9% in the GC arm. The 1-year survival rate was 32.5% in the MIC/MVP arm and 33.2% in the GC arm. In the MIC/MVP arm, 33% of patients responded (4 complete responses [CRs] and 57 partial responses [PRs]) whereas in the GC arm, 30% of patients responded (3 CRs and 54 PRs). Nonhematologic toxicity was comparable for patients with Grade 3-4 symptoms, except there was more alopecia among patients in the MIC/MVP arm. GC appeared to produce more hematologic toxicity and necessitated more transfusions. There was no difference in performance status, disease-related symptoms, or QoL between patients in the two treatment arms. Fewer inpatient stays for complications were required with GC. CONCLUSIONS. The results of the current study failed to demonstrate any difference in efficacy between the newer regimen of GC and the older regimens of MIC and MVP. © 2003 American Cancer Society.
Abstract:
Background: The randomised phase 3 First-Line Erbitux in Lung Cancer (FLEX) study showed that the addition of cetuximab to cisplatin and vinorelbine significantly improved overall survival compared with chemotherapy alone in the first-line treatment of advanced non-small-cell lung cancer (NSCLC). The main cetuximab-related side-effect was acne-like rash. Here, we assessed the association of this acne-like rash with clinical benefit. Methods: We did a subgroup analysis of patients in the FLEX study, which enrolled patients with advanced NSCLC whose tumours expressed epidermal growth factor receptor. Our landmark analysis assessed whether the development of acne-like rash in the first 21 days of treatment (first-cycle rash) was associated with clinical outcome, on the basis of patients in the intention-to-treat population alive on day 21. The FLEX study is registered with ClinicalTrials.gov, number NCT00148798. Findings: 518 patients in the chemotherapy plus cetuximab group (290 of whom had first-cycle rash) and 540 patients in the chemotherapy alone group were alive on day 21. Patients in the chemotherapy plus cetuximab group with first-cycle rash had significantly prolonged overall survival compared with patients in the same treatment group without first-cycle rash (median 15·0 months [95% CI 12·8-16·4] vs 8·8 months [7·6-11·1]; hazard ratio [HR] 0·631 [0·515-0·774]; p<0·0001). Corresponding significant associations were also noted for progression-free survival (median 5·4 months [5·2-5·7] vs 4·3 months [4·1-5·3]; HR 0·741 [0·607-0·905]; p=0·0031) and response (rate 44·8% [39·0-50·8] vs 32·0% [26·0-38·5]; odds ratio 1·703 [1·186-2·448]; p=0·0039). Overall survival for patients without first-cycle rash was similar to that of patients who received chemotherapy alone (median 8·8 months [7·6-11·1] vs 10·3 months [9·6-11·3]; HR 1·085 [0·910-1·293]; p=0·36). The significant overall survival benefit for patients with first-cycle rash versus without was seen in all histology subgroups: adenocarcinoma (median 16·9 months [14·1-20·6] vs 9·3 months [7·7-13·2]; HR 0·614 [0·453-0·832]; p=0·0015), squamous-cell carcinoma (median 13·2 months [10·6-16·0] vs 8·1 months [6·7-12·6]; HR 0·659 [0·472-0·921]; p=0·014), and carcinomas of other histology (median 12·6 months [9·2-16·4] vs 6·9 months [5·2-11·0]; HR 0·616 [0·392-0·966]; p=0·033). Interpretation: First-cycle rash was associated with a better outcome in patients with advanced NSCLC who received cisplatin and vinorelbine plus cetuximab as a first-line treatment. First-cycle rash might be a surrogate clinical marker that could be used to tailor cetuximab treatment for advanced NSCLC to those patients who would be most likely to derive a significant benefit. Funding: Merck KGaA. © 2011 Elsevier Ltd.
Abstract:
Aims: To report cancer-specific and health-related quality-of-life outcomes in patients undergoing radical chemoradiation (CRT) alone for oesophageal cancer. Materials and methods: Between 1998 and 2005, 56 patients with oesophageal cancer received definitive radical CRT, due to local disease extent, poor general health, or patient choice. Data from European Organization for Research and Treatment of Cancer quality-of-life questionnaires QLQ-C30 and QLQ-OES24 were collected prospectively. Questionnaires were completed at diagnosis, and at 3, 6 and 12 months after CRT where applicable. Results: The median follow-up was 18 months. The median overall survival was 14 months, with 1-, 3- and 5-year survival rates of 51%, 26% and 13%, respectively. At 12 months after treatment there was a significant improvement in dysphagia and pain compared with before treatment. Global health scores were not significantly affected. Conclusions: Given the limited long-term survival in this cohort of patients, maximising quality of life in the final months should be considered carefully from the outset. The health-related quality-of-life data reported here help to establish benchmarks for larger evaluations within randomised clinical trials. © 2007 The Royal College of Radiologists.
Abstract:
INTRODUCTION: Performance status (PS) 2 patients with non-small cell lung cancer (NSCLC) experience more toxicity, lower response rates, and shorter survival times than healthier patients treated with standard chemotherapy. Paclitaxel poliglumex (PPX), a macromolecule drug conjugate of paclitaxel and polyglutamic acid, reduces systemic exposure to peak concentrations of free paclitaxel and may lead to increased concentrations in tumors due to enhanced vascular permeability. METHODS: Chemotherapy-naive PS 2 patients with advanced NSCLC were randomized to receive carboplatin (area under the curve = 6) and either PPX (210 mg/m2/10 min without routine steroid premedication) or paclitaxel (225 mg/m2/3 h with standard premedication) every 3 weeks. The primary end point was overall survival. RESULTS: A total of 400 patients were enrolled. Alopecia, arthralgias/myalgias, and cardiac events were significantly less frequent with PPX/carboplatin, whereas grade ≥3 neutropenia and grade 3 neuropathy showed a trend toward worsening. There was no significant difference in the incidence of hypersensitivity reactions despite the absence of routine premedication in the PPX arm. Overall survival was similar between treatment arms (hazard ratio, 0.97; log rank p = 0.769). Median and 1-year survival rates were 7.9 months and 31% for PPX versus 8 months and 31% for paclitaxel. Disease control rates were 64% and 69% for PPX and paclitaxel, respectively. Time to progression was similar: 3.9 months for PPX/carboplatin versus 4.6 months for paclitaxel/carboplatin (p = 0.210). CONCLUSION: PPX/carboplatin failed to provide superior survival compared with paclitaxel/carboplatin in the first-line treatment of PS 2 patients with NSCLC, but the results with respect to progression-free survival and overall survival were comparable and the PPX regimen was more convenient. © 2008 International Association for the Study of Lung Cancer.
Abstract:
Purpose: In non-small-cell lung cancer (NSCLC), the epidermal growth factor receptor (EGFR) and cyclooxygenase-2 (COX-2) play major roles in tumorigenesis. This phase I/II study evaluated combined therapy with the EGFR tyrosine kinase inhibitor (TKI) gefitinib and the COX-2 inhibitor rofecoxib in platinum-pretreated, relapsed, metastatic NSCLC (n = 45). Patients and Methods: Gefitinib 250 mg/d was combined with rofecoxib (dose escalated from 12.5 to 25 to 50 mg/d through three cohorts, each n = 6). Because the rofecoxib maximum-tolerated dose was not reached, the 50 mg/d cohort was expanded for efficacy evaluation (n = 33). Results: Among the 42 assessable patients, there was one complete response (CR) and two partial responses (PRs) and 12 patients with stable disease (SD); disease control rate was 35.7% (95% CI, 21.6% to 52.0%). Median time to tumor progression was 55 days (95% CI, 47 to 70 days), and median survival was 144 days (95% CI, 103 to 190 days). In a pilot study, matrix-assisted laser desorption/ionization (MALDI) proteomics analysis of baseline serum samples could distinguish patients with an objective response from those with SD or progressive disease (PD), and those with disease control (CR, PR, and SD) from those with PD. The regimen was generally well tolerated, with predictable toxicities including skin rash and diarrhea. Conclusion: Gefitinib combined with rofecoxib provided disease control equivalent to that expected with single-agent gefitinib and was generally well tolerated. Baseline serum proteomics may help identify those patients most likely to benefit from EGFR TKIs. © 2007 by American Society of Clinical Oncology.
Abstract:
"The extended drought periods in each degradation episode have provided a test of the capacity of grazing systems (i.e. land, plants, animals, humans and social structure) to handle stress. Evidence that degradation was already occurring was identified prior to the extended drought sequences. The sequence of dry years, ranging from two to eight years, exposed and/or amplified the degradation processes. The unequivocal evidence was provided by: (a) the physical 'horror' of bare landscapes, erosion scalds and gullies and dust storms; (b) the biological devastation of woody weeds and animal suffering/deaths or forced sales, and; (c) the financial and emotional plight of graziers and their families due to reduced production in some cases leading to abandonment of properties or, sadly, deaths (e.g. McDonald 1991, Ker Conway 1989)."--Publisher website
Abstract:
Osteogenesis imperfecta (OI) is a heritable disease occurring in one out of every 20,000 births. Although it is known that Type I collagen mutation in OI leads to increased bone fragility, the mechanism of this increased susceptibility to fracture is not clear. The aim of this study was to assess the microstructure of cortical bone fragments from patients with osteogenesis imperfecta (OI) using polarized light microscopy, and to correlate microstructural observations with the results of previously performed mechanical compression tests on bone from the same source. Specimens of cortical bone were harvested from the lower limbs of three (3) OI patients at the time of surgery, and were divided into two groups. Group 1 had been subjected to previous micro-mechanical compression testing, while Group 2 had not been subjected to any prior testing. Polarized light microscopy revealed disorganized bone collagen architecture as has been previously observed, as well as a large increase in the areal porosity of the bone compared to typical values for healthy cortical bone, with large (several hundred micron sized), asymmetrical pores. Importantly, the areal porosity of the OI bone samples in Group 1 appears to correlate strongly with their previously measured apparent Young's modulus and compressive strength. Taken together with prior nanoindentation studies on OI bone tissue, the results of this study suggest that increased intra-cortical porosity is responsible for the reduction in macroscopic mechanical properties of OI cortical bone, and therefore that in vivo imaging modalities with resolutions of ~ 100 μm or less could potentially be used to non-invasively assess bone strength in OI patients. Although the number of subjects in this study is small, these results highlight the importance of further studies in OI bone by groups with access to human OI tissue in order to clarify the relationship between increased porosity and reduced macroscopic mechanical integrity.
Abstract:
We read with great interest the paper by Laivoranta-Nyman et al.1 They studied Finnish patients with rheumatoid arthritis (RA) and controls, and suggested that there exist susceptibility, neutral and protective HLA-DR-DQ haplotypes that do not match the predictions of current hypotheses for the mechanism of association of the HLA class II region and RA...
Abstract:
PURPOSE: This study investigated the significance of baseline cortisol levels and adrenal response to corticotropin in shocked patients after acute myocardial infarction (AMI). METHODS: A short corticotropin stimulation test was performed in 35 patients with cardiogenic shock after AMI by intravenous injection of 250 μg of tetracosactrin (Synacthen). Blood samples were obtained at baseline (T0) before the test and at 30 (T30) and 60 (T60) minutes after the test to determine plasma total cortisol (TC) and free cortisol concentrations. The main outcome measure was in-hospital mortality and its association with T0 TC and maximum response to corticotropin (maximum difference [Δ max] in cortisol levels between T0 and the highest value between T30 and T60). RESULTS: The in-hospital mortality was 37%, and the median time to death was 4 days (interquartile range, 3-9 days). There was some evidence of an increased mortality in patients with T0 TC concentrations greater than 34 μg/dL (P=.07). Maximum difference by itself was not an independent predictor of death. Patients with a T0 TC of 34 μg/dL or less and Δ max greater than 9 μg/dL appeared to have the most favorable survival (91%) when compared with the other 2 groups: T0 34 μg/dL or less and Δ max 9 μg/dL or less, or T0 34 μg/dL or higher and Δ max greater than 9 μg/dL (75%; P=.8); and T0 greater than 34 μg/dL and Δ max 9 μg/dL or less (60%; P=.02). Corticosteroid therapy was associated with an increased mortality (P=.03). There was a strong correlation between plasma TC and free cortisol (r=0.85). CONCLUSIONS: A high baseline plasma TC was associated with a trend toward increased mortality in patients with cardiogenic shock post-AMI. Patients with lower baseline TC, but with an inducible adrenal response, appeared to have a survival benefit. A prognostic system based on basal TC and Δ max similar to that described in septic shock appears feasible in this cohort. Corticosteroid therapy was associated with adverse outcomes. These findings require further validation in larger studies.
Abstract:
Introduction Hospitalisation for percutaneous coronary intervention (PCI) is often short, with limited nurse-teaching time and poor information absorption. Currently, patients are discharged home only to wait up to 4-8 weeks to commence a secondary prevention program and visit their cardiologist. This wait is an anxious time for patients, and confidence or self-efficacy (SE) to self-manage may be low. Objectives To determine the effects of a nurse-led educational intervention on participant SE and anxiety in the early post-discharge period. Methods A pilot study was undertaken as a randomised controlled clinical trial. Thirty-three participants were recruited, with n=13 randomised to the intervention group. A face-to-face, nurse-led educational intervention was undertaken within the first 5-7 days post-discharge. Intervention group participants received standard post-discharge education and a physical assessment, with a strong focus on the emotional impact of cardiovascular events and PCI. Early reiteration of post-discharge education was offered, along with health professional support, with the aim of increasing patients’ SE so that they could effectively manage their post-discharge health, well-being and anxieties. Self-efficacy to return to normal activities was measured using the cardiac self-efficacy (CSE) scale to gauge participants’ ability to manage post-PCI after attending the intervention. State and trait anxiety were also measured using the State-Trait Anxiety Inventory (STAI) to determine whether an increase in SE would influence participant anxiety. Results There were some increases in mean CSE scores in the intervention group participants over time. Areas of increase included return to normal social activities and confidence to change diet. Although reductions were observed in mean state and trait anxiety scores in both groups, an overall larger reduction was observed in intervention group participants over time. Conclusion It is essential that patients are given the education, support and skills to self-manage in the early post-discharge period so that they have greater SE and are less anxious. This study provides some initial evidence that nurse-led support and education during this period, particularly the first week following PCI, is beneficial and could be trialled using alternative modes of communication to support remote and rural PCI patients and extended to other cardiovascular patients.