8 results for PREOPERATIVE RADIATION-THERAPY
Abstract:
INTRODUCTION Radiotherapy outcomes might be further improved by a greater understanding of the individual variations in normal tissue reactions that determine tolerance. Most published studies on radiation toxicity have been performed retrospectively. Our prospective study was launched in 1996 to measure the in vitro radiosensitivity of peripheral blood lymphocytes before radical radiotherapy in patients with breast cancer, and to assess the early and late radiation skin side effects in the same group of patients. We prospectively recruited consecutive breast cancer patients receiving radiation therapy after breast surgery. To evaluate whether early and late side effects of radiotherapy can be predicted by the assay, we examined the association between the in vitro radiosensitivity results and acute and late adverse radiation effects. METHODS Intrinsic molecular radiosensitivity was measured with an initial radiation-induced DNA damage assay on lymphocytes obtained from breast cancer patients before radiotherapy. Acute reactions were assessed in 108 of these patients on the last treatment day. Late morbidity was assessed after 7 years of follow-up in a subset of these patients. The Radiation Therapy Oncology Group (RTOG) morbidity scoring system was used for both assessments. RESULTS Radiosensitivity values obtained with the in vitro test showed no relationship to the acute or late adverse skin reactions observed. There was no evidence of a relationship between acute and late normal tissue reactions assessed in the same patients. A positive relationship was found between treatment volume and both early and late side effects. CONCLUSION After radiation treatment, a number of cells harbouring major changes can survive for a long time and disappear very slowly, becoming a chronic focus of immune system stimulation. This stimulation can produce, in a stochastic manner, late radiation-related adverse effects of varying severity.
Further research is warranted to identify the major determinants of normal tissue radiation response to make it possible to individualize treatments and improve the outcome of radiotherapy in cancer patients.
Abstract:
BACKGROUND. Higher levels of initial DNA damage and lower levels of radiation-induced apoptosis in peripheral blood lymphocytes have each been associated with an increased risk of developing late radiation-induced toxicity. It has recently been reported that these two predictive tests are inversely related. The aim of the present study was to investigate the combined role of both tests in relation to clinical radiation-induced toxicity in a set of breast cancer patients treated with high-dose hyperfractionated radical radiotherapy. METHODS. Peripheral blood lymphocytes were taken from 26 consecutive patients with locally advanced breast carcinoma treated with high-dose hyperfractionated radical radiotherapy. Acute and late cutaneous and subcutaneous toxicity was evaluated using the Radiation Therapy Oncology Group morbidity scoring schema. The mean follow-up of survivors (n = 13) was 197.23 months. Radiosensitivity of lymphocytes was quantified as the initial number of DNA double-strand breaks (DSB) induced per Gy and per DNA unit (200 Mbp). Radiation-induced apoptosis (RIA) at 1, 2 and 8 Gy was measured by flow cytometry using annexin V/propidium iodide. RESULTS. The mean DSB/Gy/DNA unit obtained was 1.70 ± 0.83 (range 0.63-4.08; median 1.46). Radiation-induced apoptosis increased with radiation dose (median 12.36, 17.79 and 24.83 for 1, 2 and 8 Gy, respectively). In multivariate analysis, "expected resistant patients" (DSB values below 1.78 DSB/Gy per 200 Mbp and RIA values above 9.58, 14.40 or 24.83 for 1, 2 and 8 Gy, respectively) were at low risk of suffering severe late subcutaneous toxicity (HR 0.223, 95% CI 0.073-0.678, P = 0.008; HR 0.206, 95% CI 0.063-0.677, P = 0.009; HR 0.239, 95% CI 0.062-0.929, P = 0.039, for RIA at 1, 2 and 8 Gy, respectively). CONCLUSIONS.
A radiation-resistant profile is proposed: patients who presented lower levels of initial DNA damage and higher levels of radiation-induced apoptosis were at low risk of suffering severe late subcutaneous toxicity after clinical treatment at high radiation doses in our series. However, given the small sample size, prospective studies with larger numbers of patients are needed to validate these results.
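As a rough illustration, the proposed "expected resistant" profile can be written as a simple decision rule using the cutoffs reported in the abstract above; the function name and structure are our own sketch, not taken from the study.

```python
# RIA (radiation-induced apoptosis) cutoffs reported for 1, 2 and 8 Gy
RIA_CUTOFFS = {1: 9.58, 2: 14.40, 8: 24.83}
DSB_CUTOFF = 1.78  # DSB/Gy per 200 Mbp

def expected_resistant(dsb_per_gy, ria_value, dose_gy):
    """True if a patient matches the low-risk profile: initial DNA damage
    below the DSB cutoff AND apoptosis above the dose-specific RIA cutoff."""
    if dose_gy not in RIA_CUTOFFS:
        raise ValueError("RIA cutoffs were reported only for 1, 2 and 8 Gy")
    return dsb_per_gy < DSB_CUTOFF and ria_value > RIA_CUTOFFS[dose_gy]

# A patient at the series medians (1.46 DSB/Gy, RIA 17.79 at 2 Gy) matches:
print(expected_resistant(1.46, 17.79, 2))  # -> True
```

Note that both conditions must hold; low initial damage alone, or high apoptosis alone, does not place a patient in the low-risk group under this rule.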
Abstract:
Recent advances in our understanding of cancer biology and immunology show that infiltrating immune cells and cytokines in the tumor microenvironment may serve different functions that appear tightly related to clinical outcomes. Strategies aimed at interfering with the cross-talk between tumor cells and their cellular partners in the microenvironment have been considered for the development of new immunotherapies. These novel therapies target different cell components of the tumor microenvironment and, importantly, they may be coupled with and boosted by classical treatments such as radiotherapy. In this work, we summarize recent data on the impact of radiation therapy on the tumor microenvironment, from preclinical research to the clinic, bearing in mind that this new knowledge will probably translate into changes in the indications and objectives of radiation therapy in the near future.
Abstract:
INTRODUCTION Metastases are detected in 20% of patients with solid tumours at diagnosis and in a further 30% after diagnosis. Radiation therapy (RT) has proven effective in bone (BM) and brain (BrM) metastases. The objective of this study was to analyze the variability of RT utilization rates in clinical practice and the accessibility of medical technology in our region. PATIENTS AND METHODS We reviewed the clinical records and RT treatment sheets of all patients undergoing RT for BM and/or BrM during 2007 in the 12 public hospitals of an autonomous region of Spain. Data were gathered on hospital type, patient type and RT treatment characteristics. The rate of RT use was calculated from the cancer incidence and the number of RT treatments for BM, BrM and all cancer sites. RESULTS Of the 9319 patients undergoing RT during 2007 for cancer at any site, 1242 (13.3%; inter-hospital range, 26.3%) received RT for BM (n = 744) or BrM (n = 498). These 1242 patients represented 79% of all RT treatments with palliative intent, and the most frequent primary tumours were in the lung, breast, prostate or digestive system. No significant differences between the BM and BrM groups were observed in mean age (62 vs. 59 yrs, respectively), gender (approximately 64% male and 36% female in both), performance status (ECOG 0-1 in 70 vs. 71%), mean distance from hospital (36 vs. 28.6 km) or time from consultation to RT treatment (13 vs. 14.3 days). RT regimens differed among hospitals and between patient groups: 10 × 300 cGy, 5 × 400 cGy and 1 × 800 cGy were applied in 32%, 27% and 25% of BM patients, respectively, whereas 10 × 300 cGy was used in 49% of BrM patients. CONCLUSIONS Palliative RT use in BM and BrM is high and close to the expected rate, unlike the global rate of RT application for all cancers in our setting. Differences in RT schedules among hospitals may reflect variability in clinical practice among medical teams.
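The headline utilization figure in this abstract follows directly from the reported counts; a minimal arithmetic check, using only numbers stated above:

```python
# Counts reported in the abstract
total_rt_patients = 9319   # all patients irradiated for any cancer site in 2007
bm_patients = 744          # treated for bone metastases
brm_patients = 498         # treated for brain metastases

metastasis_rt = bm_patients + brm_patients        # 1242 patients
share = 100 * metastasis_rt / total_rt_patients   # share of all RT courses

print(f"{metastasis_rt} patients ({share:.1f}% of all RT)")
```

This reproduces the 1242 patients and the 13.3% share quoted in the RESULTS section.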
Abstract:
Background: We aimed to investigate the possibility of using 18F positron emission tomography/computed tomography (PET-CT) to predict the histopathologic response in locally advanced rectal cancer (LARC) treated with preoperative chemoradiation (CRT). Methods: The study included 50 patients with LARC treated with preoperative CRT. All patients were evaluated by PET-CT before and after CRT, and the results were compared to the histopathologic response quantified by tumour regression grade (TRG; patients with TRG 1-2 defined as responders and patients with grade 3-5 as non-responders). Furthermore, the predictive value of metabolic imaging for pathologic complete response (ypCR) was investigated. Results: Responders and non-responders showed statistically significant differences according to Mandard's criteria for maximum standardized uptake value (SUVmax) before and after CRT, with a specificity of 76.6% and a positive predictive value of 66.7%. Furthermore, SUVmax values after CRT were able to differentiate patients with ypCR with a sensitivity of 63% and a specificity of 74.4% (positive predictive value 41.2%, negative predictive value 87.9%). This rather low sensitivity and specificity meant that PET-CT was able to identify only 7 cases of ypCR from a total of 11 patients. Conclusions: We conclude that 18F PET-CT performed five to seven weeks after the end of CRT can visualise functional tumour response in LARC. In contrast, metabolic imaging with 18F PET-CT is not able to predict ypCR accurately.
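The accuracy figures quoted for ypCR prediction are mutually consistent with a 2 × 2 table of 7 true positives, 4 false negatives, 10 false positives and 29 true negatives. Only the 7-of-11 detected ypCR cases are stated explicitly; the remaining counts are our reconstruction from n = 50 and the quoted percentages:

```python
# Reconstructed confusion matrix for ypCR prediction (n = 50)
tp, fn = 7, 4    # 7 of 11 ypCR cases identified (stated in the abstract)
fp, tn = 10, 29  # inferred from the quoted specificity, PPV and NPV

sensitivity = tp / (tp + fn)   # 7/11  ~ 63%
specificity = tn / (tn + fp)   # 29/39 ~ 74.4%
ppv = tp / (tp + fp)           # 7/17  ~ 41.2%
npv = tn / (tn + fn)           # 29/33 ~ 87.9%

print(f"sens {sensitivity:.1%}  spec {specificity:.1%}  "
      f"PPV {ppv:.1%}  NPV {npv:.1%}")
```

Each derived value rounds to the figure reported in the RESULTS section, which supports the reconstruction.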
Abstract:
Perioperative anaemia, with iron deficiency as its leading cause, is a frequent condition among surgical patients and has been linked to increased postoperative morbidity and mortality and decreased quality of life. Postoperative anaemia is even more frequent and is mainly caused by perioperative blood loss, aggravated by inflammation-induced blunting of erythropoiesis. Allogeneic transfusion is commonly used to treat acute perioperative anaemia, but it also increases the rate of morbidity and mortality in surgical and critically ill patients. Thus, overall concerns about the adverse effects of both preoperative anaemia and allogeneic transfusion have prompted a review of transfusion practice and the search for safer and more biologically rational treatment options. In this paper, we review the role of intravenous iron therapy (mostly with iron sucrose and ferric carboxymaltose) as a safe and efficacious tool for treating anaemia and reducing transfusion requirements in surgical patients, as well as in other medical areas. From the analysis of published data, and despite the lack of high-quality evidence in some areas, it seems fair to conclude that perioperative intravenous iron administration, with or without erythropoiesis-stimulating agents, is safe, results in lower transfusion requirements and hastens recovery from postoperative anaemia. In addition, some studies have reported decreased rates of postoperative infection and mortality, and shorter length of hospital stay, in surgical patients receiving intravenous iron.
Abstract:
BACKGROUND Preoperative chemoradiotherapy (CRT) is the cornerstone of treatment for locally advanced rectal cancer (LARC). Although high local control is achieved, overall rates of distant control remain suboptimal. Colorectal carcinogenesis is associated with critical alterations of the Wnt/β-catenin pathway involved in proliferation and survival. The aim of this study was to assess whether CRT induces changes in the expression of β-catenin/E-cadherin, and to determine whether these changes are associated with survival. METHODS The immunohistochemical expression of nuclear β-catenin and membranous E-cadherin was prospectively analysed in tumour blocks from 98 stage II/III rectal cancer patients treated with preoperative CRT. Tumour samples were collected before and after CRT. All patients were treated with pelvic RT (46-50 Gy in 2 Gy fractions) and intravenous 5-fluorouracil (5FU) infusion (225 mg/m2) or capecitabine (825 mg/m2) during RT, followed by total mesorectal excision (TME). Disease-free survival (DFS) was analysed using the Kaplan-Meier method, and a Cox regression model was employed for the multivariate analysis. RESULTS CRT induced significant changes in the expression of nuclear β-catenin (49% of patients presented increased expression after CRT, 17% decreased expression and 34% no change; p = 0.001). After a median follow-up of 25 months, patients who overexpressed nuclear β-catenin after CRT showed poorer survival than patients who experienced a decrease in nuclear β-catenin expression (3-year DFS 92% vs. 43%, HR 0.17; 95% CI 0.03 to 0.8; p = 0.02). In the multivariate analysis for DFS, increased nuclear β-catenin expression after CRT almost reached the cut-off for significance (p = 0.06). CONCLUSIONS In our study, preoperative CRT for LARC induced significant changes in nuclear β-catenin expression, which had a major impact on survival.
Finding a way to decrease CRT resistance would significantly improve LARC patient survival.
Abstract:
To date, no effective method exists to predict the response to preoperative chemoradiation (CRT) in locally advanced rectal cancer (LARC). Nevertheless, identifying patients with a higher likelihood of responding to preoperative CRT could be crucial for decreasing treatment morbidity and avoiding expensive and time-consuming treatments. The aim of this study was to identify signatures or molecular markers related to the response to preoperative CRT in LARC. We analyzed the gene expression profiles of 26 pre-treatment biopsies of LARC (10 responders and 16 non-responders) without metastasis using the Human WG CodeLink microarray platform. Two hundred fifty-seven genes were differentially over-expressed in the responder patient subgroup. Ingenuity Pathway Analysis revealed a significant ratio of differentially expressed genes related to cancer, cellular growth and proliferation pathways, and the c-Myc network. We demonstrated that high Gng4, c-Myc, Pola1 and Rrm1 mRNA expression levels were a significant prognostic factor for response to treatment in LARC patients (p < 0.05). Using this gene set, we were able to establish a new model for predicting the response to CRT in rectal cancer with a sensitivity of 60% and a specificity of 100%. Our results reflect the value of gene expression profiling in gaining insight into the molecular pathways involved in the response to treatment of LARC patients. These findings could be clinically relevant and support the use of mRNA levels when aiming to identify patients who respond to CRT.