892 results for Patterns of failure
Abstract:
PURPOSE: To assess the outcome and patterns of failure in patients with testicular lymphoma treated by chemotherapy (CT) and/or radiation therapy (RT). METHODS AND MATERIALS: Data from a series of 36 adult patients with Ann Arbor Stage I (n = 21), II (n = 9), III (n = 3), or IV (n = 3) primary testicular lymphoma, consecutively treated between 1980 and 1999, were collected in a retrospective multicenter study by the Rare Cancer Network. Median age was 64 years (range: 21-91 years). A full staging workup (chest X-ray, testicular ultrasound, abdominal ultrasound and/or thoracoabdominal computed tomography, bone marrow assessment, full blood count, lactate dehydrogenase, and cerebrospinal fluid evaluation) was completed in 18 (50%) patients. All but one patient underwent orchidectomy, and spermatic cord infiltration was found in 9 patients. Most patients (n = 29) received CT, consisting in most cases of cyclophosphamide, doxorubicin, vincristine, and prednisolone (CHOP), with (n = 17) or without intrathecal CT. External RT was delivered to the scrotum alone (n = 12) or to the testicular, iliac, and para-aortic regions (n = 8). The median RT dose was 31 Gy (range: 20-44 Gy) in a median of 17 fractions (range: 10-24), using a median of 1.8 Gy (range: 1.5-2.5 Gy) per fraction. The median follow-up period was 42 months (range: 6-138 months). RESULTS: After a median period of 11 months (range: 1-76 months), 14 patients presented with lymphoma progression, mostly in the central nervous system (CNS) (n = 8). Among the 17 patients who received intrathecal CT, 4 had a CNS relapse (p = NS). No testicular, iliac, or para-aortic relapse was observed in patients receiving RT to these regions. The 5-year overall, lymphoma-specific, and disease-free survival rates were 47%, 66%, and 43%, respectively. In univariate analyses, the statistically significant factors favorably influencing outcome were early-stage disease and combined-modality treatment. Neither RT technique nor total dose influenced the outcome. Multivariate analysis revealed that the most favorable independent factors predicting outcome were younger age, early-stage disease, and combined-modality treatment. CONCLUSIONS: In this multicenter retrospective study, the CNS was found to be the principal site of relapse, and no extra-CNS lymphoma progression was observed within the irradiated volumes. More effective CNS prophylaxis, including combined modalities, should be prospectively explored in this uncommon site of extranodal lymphoma.
Abstract:
BACKGROUND: To evaluate the outcome of patients with carcinoma of the anal margin in terms of recurrence, survival, and radiation toxicity. METHODS: A series of 45 consecutive patients with anal margin carcinoma, treated with curative intent at two institutions between 1983 and 2006, was retrospectively analyzed. Surgical excision (with a close or positive surgical margin in 22 of 29 patients) was performed before radiotherapy (RT). RT consisted of definitive external beam RT (EBRT) in 36 patients, brachytherapy (BT) alone in two patients, and both BT and EBRT in seven patients. The median total radiation dose was 59.4 Gy (range, 30-74 Gy). RESULTS: The 5-year locoregional control (LRC) rate was 78% [95% confidence interval (CI), 64-93%]. The 5-year disease-specific survival (DSS) and overall survival (OS) rates were 86% (95% CI, 72-99%) and 55% (95% CI, 44-66%), respectively. The overall anal conservation rate was 80% for the whole series. There was no significant association between local recurrence and patient age, histological grade, tumor size, T stage, overall treatment time, RT dose, or chemotherapy. Long-term side effects were observed in 15 patients (33%). Only three patients developed grade 3-4 late toxicity (CTCAE/NCI v3.0). A significant relationship was found between dose and complication rate (48% for doses ≥59.4 Gy versus 8% for doses <59.4 Gy; P = 0.03). CONCLUSIONS: We conclude that definitive RT and/or BT yields good local control and disease-specific survival comparable with published data. This study suggests that a radiation dose over 59.4 Gy increases treatment-related morbidity.
Abstract:
Background: To determine the outcome and patterns of failure in oral cavity cancer (OCC) patients after postoperative intensity modulated radiotherapy (IMRT) with concomitant systemic therapy. Methods: All patients with locally advanced (AJCC stage III/IV) or high-risk (AJCC stage II) OCC who underwent postoperative IMRT at our institution between December 2006 and July 2010 were retrospectively analyzed. The primary endpoint was locoregional recurrence-free survival (LRRFS). Secondary endpoints included distant metastasis-free survival (DMFS), overall survival (OS), and acute and late toxicities. Results: Overall, 53 patients were analyzed. Twenty-three patients (43%) received concomitant chemotherapy with cisplatin, two patients (4%) received carboplatin, and four patients (8%) were treated with the monoclonal antibody cetuximab. At a median follow-up of 2.3 (range, 1.1–4.6) years, the 3-year LRRFS, DMFS, and OS estimates were 79%, 90%, and 73%, respectively. Twelve patients experienced a locoregional recurrence. Eight patients, 5 of whom had both a flap reconstruction and extracapsular extension (ECE), showed an unusual multifocal pattern of recurrence. Ten locoregional recurrences occurred marginal to or outside of the high-risk target volumes. Acute toxicity of grade 2 (27%) and grade 3 (66%) and late toxicity of grade 2 (34%) and grade 3 (11%) were observed. Conclusion: LRRFS after postoperative IMRT is satisfactory and toxicity is acceptable. The majority of locoregional recurrences occurred marginal to or outside of the high-risk target volumes. Improved high-risk target volume definition, especially in patients with flap reconstruction and ECE, might translate into better locoregional control.
Abstract:
PURPOSE: To assess the outcomes and patterns of failure in solitary plasmacytoma (SP). METHODS AND MATERIALS: The data from 258 patients with bone (n = 206) or extramedullary (n = 52) SP without evidence of multiple myeloma (MM) were collected. A histopathologic diagnosis was obtained for all patients. Most patients (n = 214) received radiotherapy (RT) alone; 34 received chemotherapy and RT, and 8 underwent surgery alone. The median radiation dose was 40 Gy. The median follow-up was 56 months (range 7-245). RESULTS: The median time to MM development was 21 months (range 2-135), with a 5-year probability of 45%. The 5-year overall survival, disease-free survival, and local control rates were 74%, 50%, and 86%, respectively. On multivariate analyses, the favorable factors were younger age and tumor size <4 cm for survival; younger age, extramedullary localization, and RT for disease-free survival; and small tumor size and RT for local control. Bone localization was the only predictor of MM development. No dose-response relationship was found for doses >30 Gy, even for larger tumors. CONCLUSION: Progression to MM remains the main problem. Patients with extramedullary SP had the best outcomes, especially when treated with moderate-dose RT. Chemotherapy and/or novel therapies should be investigated for bone or bulky extramedullary SP.
Abstract:
Adherence patterns and their influence on virologic outcome are well characterized for protease inhibitor (PI)- and non-nucleoside reverse transcriptase inhibitor (NNRTI)-based regimens. We aimed to determine how patterns of adherence to raltegravir influence the risk of virological failure. We conducted a prospective multicenter cohort study following 81 HIV-infected antiretroviral-naive or -experienced subjects receiving or starting twice-daily raltegravir-based antiretroviral therapy. Adherence patterns were monitored using the Medication Events Monitoring System. During follow-up (188 ± 77 days), 12 (15%) of 81 subjects experienced virological failure. Longer treatment interruption (adjusted odds ratio per 24-hour increase: 2.4; 95% confidence interval: 1.2 to 6.9; P < 0.02) and average adherence (odds ratio per 5% increase: 0.68; 95% confidence interval: 0.46 to 1.00; P < 0.05) were both independently associated with virological failure, controlling for prior duration of viral suppression. Timely interdose intervals and high levels of adherence to raltegravir are both necessary to control HIV replication.
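As a hedged illustration (not the study's code), the per-increment effect sizes reported above can be read as rescaled logistic-regression coefficients: an odds ratio "per 24-hour increase" or "per 5% increase" equals exp(beta × k) for a per-unit log-odds coefficient beta and increment k. The coefficients in the sketch below are hypothetical values chosen only to reproduce the reported scales.

```python
# Illustrative only: how per-increment odds ratios (e.g., "per 24-hour increase"
# in treatment interruption, "per 5% increase" in adherence) relate to a
# per-unit logistic-regression coefficient. The coefficients below are invented.
import math

def odds_ratio_per_increment(beta_per_unit: float, increment: float) -> float:
    """Rescale a per-unit log-odds coefficient to an odds ratio per `increment` units."""
    return math.exp(beta_per_unit * increment)

# Hypothetical per-hour coefficient for interruption length -> OR per 24 hours
beta_per_hour = 0.036
print(odds_ratio_per_increment(beta_per_hour, 24))   # ~2.4, the scale reported above

# Hypothetical per-percentage-point coefficient for adherence -> OR per 5 points
beta_per_point = -0.077
print(odds_ratio_per_increment(beta_per_point, 5))   # ~0.68, i.e. protective
```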
Abstract:
Purpose: To assess clinical outcomes and patterns of loco-regional failure (LRF) in relation to clinical target volumes (CTV) in patients with locally advanced hypopharyngeal and laryngeal squamous cell carcinoma (HL-SCC) treated with definitive intensity modulated radiotherapy (IMRT) and concurrent systemic therapy. Methods: Data from HL-SCC patients treated from 2007 to 2010 were retrospectively evaluated. The primary endpoint was loco-regional control (LRC). Secondary endpoints included local (LC) and regional (RC) control, distant metastasis-free survival (DMFS), laryngectomy-free survival (LFS), overall survival (OS), and acute and late toxicities. Time-to-event endpoints were estimated using the Kaplan-Meier method, and univariate and multivariate analyses were performed using Cox proportional hazards models. Recurrent gross tumor volume (RTV) on post-treatment diagnostic imaging was analyzed in relation to the corresponding CTV (in-volume, >95% of RTV inside the CTV; marginal, 20-95% inside the CTV; out-volume, <20% inside the CTV). Results: Fifty patients (stage III: 14, IVa: 33, IVb: 3) completed treatment and were included in the analysis (median follow-up of 4.2 years). Three-year LRC, DMFS, and OS were 77%, 96%, and 63%, respectively. Grade 2 and 3 acute toxicity rates were 38% and 62%, respectively; grade 2 and 3 late toxicity rates were 23% and 15%, respectively. We identified 10 patients with LRF (8 local, 1 regional, 1 local + regional). Six of the 10 RTVs were fully included in both the elective and high-dose CTVs, and 4 RTVs were marginal to the high-dose CTVs. Conclusion: The treatment of locally advanced HL-SCC with definitive IMRT and concurrent systemic therapy provides good LRC rates with an acceptable toxicity profile. Nevertheless, the analysis of LRFs in relation to CTVs showed in-volume relapses to be the major mode of recurrence, indicating that novel strategies to overcome radioresistance are required.
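As a hedged sketch of the analysis workflow named above (Kaplan-Meier estimation of time-to-event endpoints and Cox proportional hazards modeling), the snippet below uses the Python lifelines library on an invented toy dataset; the column names (time_years, lrf_event, age, stage_ivb) are illustrative assumptions, not the study's variables.

```python
# A minimal sketch (not the authors' code) of the time-to-event workflow described
# above: a Kaplan-Meier estimate for loco-regional control and a Cox proportional
# hazards model for multivariate analysis, on invented data.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "time_years": [1.2, 4.2, 3.5, 2.0, 4.6, 3.0],  # follow-up or time to failure
    "lrf_event":  [1, 0, 1, 1, 0, 0],              # 1 = loco-regional failure observed
    "age":        [61, 55, 67, 59, 72, 64],
    "stage_ivb":  [0, 0, 1, 1, 0, 1],
})

# Kaplan-Meier estimate of loco-regional control (event = failure)
kmf = KaplanMeierFitter()
kmf.fit(df["time_years"], event_observed=df["lrf_event"])
print("3-year LRC estimate:", float(kmf.survival_function_at_times(3.0).iloc[0]))

# Cox proportional hazards model for multivariate analysis of candidate factors
cph = CoxPHFitter()
cph.fit(df, duration_col="time_years", event_col="lrf_event")
cph.print_summary()
```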
Abstract:
Resistance to drugs is a major cause of treatment failure in pediatric brain cancer. The multidrug resistance (MDR) phenotype can be mediated by the superfamily of adenosine triphosphate-binding cassette (ABC) transporters. The dynamics of expression of MDR genes after exposure to chemotherapy, especially the comparison between pediatric brain tumors of different histology, are poorly described. The aim was to compare the expression profiles of the multidrug resistance genes ABCB1, ABCC1, and ABCG2 in different neuroepithelial pediatric brain tumor cell lines prior to and following short-term culture with vinblastine. Immortalized cell lines derived from pilocytic astrocytoma (R286), anaplastic astrocytoma (UW467), glioblastoma (SF188), and medulloblastoma (UW3) were exposed to vinblastine sulphate at different schedules (10 and 60 nM for 24 and 72 h). Relative amounts of mRNA expression were analyzed by real-time quantitative polymerase chain reaction. Protein expression of ABCB1, ABCC1, and ABCG2 was assessed by immunohistochemistry. mRNA expression of ABCB1 increased with increasing concentration and duration of exposure to vinblastine in the R286, UW467, and UW3 cell lines. Interestingly, ABCB1 expression levels diminished in SF188. Following chemotherapy, mRNA expression of ABCC1 decreased in all cell lines other than glioblastoma. ABCG2 expression was influenced by vinblastine only in UW3. The mRNA levels were consistently associated with protein expression in the cell lines analyzed. The pediatric glioblastoma cell line SF188 shows a different pattern of expression of multidrug resistance genes when exposed to vinblastine. These preliminary findings may be useful in determining novel strategies of treatment for neuroepithelial pediatric brain tumors.
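The abstract does not name the relative-quantification method; assuming the common 2^-ΔΔCt (Livak) approach as a hedged example, the sketch below shows how a fold change in, say, ABCB1 mRNA after vinblastine exposure would be computed from invented Ct values normalized to a housekeeping gene and an untreated control.

```python
# A hedged sketch of one standard relative-quantification scheme (2^-ΔΔCt, the
# Livak method) — an assumption, since the abstract does not state the method.
# All Ct values below are invented for illustration.
def fold_change_ddct(ct_target_treated: float, ct_ref_treated: float,
                     ct_target_control: float, ct_ref_control: float) -> float:
    """Return relative expression (fold change) via 2^-ΔΔCt."""
    delta_ct_treated = ct_target_treated - ct_ref_treated   # target vs. housekeeping, treated
    delta_ct_control = ct_target_control - ct_ref_control   # target vs. housekeeping, control
    delta_delta_ct = delta_ct_treated - delta_ct_control
    return 2 ** (-delta_delta_ct)

# Hypothetical example: ABCB1 vs. a housekeeping gene after 72 h of 60 nM vinblastine
print(fold_change_ddct(ct_target_treated=24.1, ct_ref_treated=18.0,
                       ct_target_control=26.3, ct_ref_control=18.2))  # 4.0-fold increase
```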
Abstract:
BACKGROUND: Mortality among HIV-infected persons is decreasing, and causes of death are changing. Classification of deaths is hampered by low autopsy rates, frequent deaths outside of hospitals, and shortcomings of International Statistical Classification of Diseases and Related Health Problems (ICD-10) coding. METHODS: We studied mortality among Swiss HIV Cohort Study (SHCS) participants (1988-2010) and causes of death using the Coding Causes of Death in HIV (CoDe) protocol (2005-2009). Furthermore, we linked the SHCS data to the Swiss National Cohort (SNC) cause of death registry. RESULTS: AIDS-related mortality peaked in 1992 [11.0/100 person-years (PY)] and decreased to 0.144/100 PY (2006); non-AIDS-related mortality ranged between 1.74/100 PY (1993) and 0.776/100 PY (2006); mortality of unknown cause ranged between 2.33 and 0.206/100 PY. From 2005 to 2009, 459 of 9053 participants (5.1%) died. Underlying causes of death were: non-AIDS malignancies [total, 85 (19%) of 446 deceased persons with known hepatitis C virus (HCV) status; HCV-negative persons, 59 (24%); HCV-coinfected persons, 26 (13%)]; AIDS [73 (16%); 50 (21%); 23 (11%)]; liver failure [67 (15%); 12 (5%); 55 (27%)]; non-AIDS infections [42 (9%); 13 (5%); 29 (14%)]; substance use [31 (7%); 9 (4%); 22 (11%)]; suicide [28 (6%); 17 (7%); 11 (6%)]; and myocardial infarction [28 (6%); 24 (10%); 4 (2%)]. Characteristics of deceased persons differed between 2005 and 2009: median age (45 vs. 49 years, respectively); median CD4 count (257 vs. 321 cells/μL); the percentage of individuals who were antiretroviral therapy-naïve (13 vs. 5%); the percentage of deaths that were AIDS-related (23 vs. 9%); and the percentage of deaths from non-AIDS-related malignancies (13 vs. 24%). Concordance in the classification of deaths was 72% between CoDe and ICD-10 coding in the SHCS, and 60% between the SHCS and the SNC registry. CONCLUSIONS: Mortality in HIV-positive persons decreased to 1.33/100 PY in 2010. Hepatitis B or C virus coinfection increased the risk of death. Between 2005 and 2009, 84% of deaths were non-AIDS-related. Causes of death varied according to data source and coding system.
Abstract:
In the northeastern United States, grassland birds regularly use agricultural fields as nesting habitat. However, birds that nest in these fields regularly experience nest failure as a result of agricultural practices, such as mowing and grazing. Therefore, information on both spatial and temporal patterns of habitat use is needed to effectively manage these species. We addressed these complex habitat use patterns by conducting point counts during three time intervals between May 21, 2002 and July 2, 2002 in agricultural fields across the Champlain Valley in Vermont and New York. Early in the breeding season, Bobolinks (Dolichonyx oryzivorus) used fields in which the landscape within 2500 m was dominated by open habitats. As mowing began, suitable habitat within 500 m became more important. Savannah Sparrows (Passerculus sandwichensis) initially used fields that contained a high proportion of suitable habitat within 500 m. After mowing, features of the field (i.e., size and amount of woody edge) became more important. Each species responded differently to mowing: Savannah Sparrows were equally abundant in mowed and uncut fields, whereas Bobolinks were more abundant in uncut fields. In agricultural areas in the Northeast, large areas (2000 ha) that are mostly nonforested and undeveloped should be targeted for conservation. Within large open areas, smaller patches (80 ha) should be maintained as high-quality, late-cut grassland habitat.
Abstract:
Patterns of colonization by queens and incipient nest survival of the leaf-cutting ants Acromyrmex niger and Acromyrmex balzani were studied by means of belt transects and individually marked incipient nests. No relation was found between colony density and the number of colonization attempts. Neither species is claustral, and high rates of queen mortality were attributed to conspecific executions and predation. Of the other discernible mortality factors, failure of fungal garden establishment was the most important. Only 34 of 296 A. balzani and 13 of 154 A. niger marked colonies were alive at the end of one year. These figures are higher than those reported for species of Atta. These results are contrasted with those of claustral-founding Atta species. Small colonies are occasionally raided by larger colonies, which rob their brood.
Abstract:
Background and Objectives: Patients who survive acute kidney injury (AKI), especially those with partial renal recovery, face a higher long-term mortality risk. However, there is no consensus on the best time to assess renal function after an episode of acute kidney injury, nor agreement on the definition of renal recovery. In addition, only limited data regarding predictors of recovery are available. Design, Setting, Participants, & Measurements: From 1984 to 2009, 84 adult survivors of acute kidney injury were followed by the same nephrologist (RCRMA) for a median time of 4.1 years. Patients were seen at least once each year after discharge until end-stage renal disease (ESRD) or death. At each consultation, serum creatinine was measured and the glomerular filtration rate estimated. Renal recovery was defined as a glomerular filtration rate ≥60 mL/min/1.73 m². A multiple logistic regression was performed to evaluate factors independently associated with renal recovery. Results: The median length of follow-up was 50 months (30-90 months). All patients had stabilized their glomerular filtration rates by 18 months, and 83% of them had stabilized earlier, within 12 months. Renal recovery occurred in 16 patients (19%) at discharge and in 54 (64%) by 18 months. Six patients died and four patients progressed to ESRD during the follow-up period. Age (OR 1.09, p < 0.0001) and serum creatinine at hospital discharge (OR 2.48, p = 0.007) were independent factors associated with non-recovery of renal function. The severity of acute kidney injury, evaluated by peak serum creatinine and need for dialysis, was not associated with non-recovery. Conclusions: Renal recovery should be evaluated no earlier than one year after an acute kidney injury episode. Nephrology referral should be considered mainly for older patients and those with elevated serum creatinine at hospital discharge.
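As a hedged sketch of the multiple logistic regression mentioned above (factors independently associated with renal recovery), the snippet below fits such a model with statsmodels on invented data and reports odds ratios; the variable names (non_recovery, age, discharge_scr) are illustrative assumptions, not the study's dataset.

```python
# A minimal sketch (an assumption, not the study's code) of a multiple logistic
# regression modeling non-recovery of renal function from age and serum
# creatinine at discharge, with odds ratios and 95% confidence intervals.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "non_recovery":  [1, 0, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0],   # 1 = GFR stayed < 60
    "age":           [72, 48, 80, 66, 60, 58, 44, 69, 75, 81, 52, 71],
    "discharge_scr": [2.4, 1.1, 1.3, 2.6, 1.6, 2.8, 1.0, 1.4, 2.2, 3.4, 1.2, 1.8],  # mg/dL
})

model = smf.logit("non_recovery ~ age + discharge_scr", data=df).fit(disp=False)

# Odds ratios (per 1-year and per 1-mg/dL increase) with 95% confidence intervals
odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_lower": np.exp(model.conf_int()[0]),
    "CI_upper": np.exp(model.conf_int()[1]),
})
print(odds_ratios)
```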
Abstract:
PURPOSE: Neutral endopeptidase (CD10), an ectopeptidase bound to the cell surface, is thought to be a potential prognostic marker for prostate cancer. EXPERIMENTAL DESIGN: Prostate cancer patients (N = 3,261) treated by radical prostatectomy at a single institution were evaluated using a tissue microarray. Follow-up data were available for 2,385 patients. The cellular compartment of CD10 expression (membranous, membranous-cytoplasmic, or cytoplasmic only) was analyzed immunohistochemically and correlated with various clinical and histopathologic features of the tumors. RESULTS: CD10 expression was detected in 62.2% of cancer samples and occurred preferentially in higher Gleason patterns (P < 0.0001). CD10 expression correlated positively with adverse tumor features such as elevated preoperative prostate-specific antigen (PSA), higher Gleason score, and advanced stage (P < 0.0001 each). Survival analyses showed that PSA recurrence was significantly associated with the staining pattern of CD10 expression. Outcome declined progressively from negative, through membranous and membranous-cytoplasmic, to exclusively cytoplasmic CD10 expression (P < 0.0001). In multivariate analysis, CD10 expression was an independent predictor of PSA failure (P = 0.0343). CONCLUSIONS: CD10 expression is an unfavorable independent risk factor in prostate cancer. The subcellular localization of CD10 protein is associated with distinct clinical courses, suggesting an effect on different important biological properties of prostate cancer cells. The frequent expression of CD10 in prostate cancer and the strong association of CD10 with unfavorable tumor features may qualify this biomarker for targeted therapies.
Abstract:
The current therapeutic strategy in breast cancer is to identify a target, such as estrogen receptor (ER) status, for tailoring treatments. We investigated the patterns of recurrence with respect to ER status for patients treated in two randomized trials with 25 years' median follow-up. In the ER-negative subpopulations, most breast cancer events occurred within the first 5-7 years after randomization, while in the ER-positive subpopulations breast cancer events were spread over 10 years. In the ER-positive subpopulation, 1 year of endocrine treatment alone significantly prolonged disease-free survival (DFS), with no additional benefit observed from adding 1 year of chemotherapy. In the small ER-negative subpopulation, chemo-endocrine therapy yielded significantly better DFS than endocrine therapy alone or no treatment. Despite small numbers of patients, "old-fashioned" treatments, and competing causes of treatment failure, the value of ER status as a target for response to adjuvant treatment is evident through prolonged follow-up.
Abstract:
Trees from tropical montane cloud forest (TMCF) display very dynamic patterns of water use. They are capable of downward water transport towards the soil during leaf-wetting events, likely a consequence of foliar water uptake (FWU), as well as high rates of night-time transpiration (Enight) during drier nights. These two processes might represent important sources of water loss and gain for the plant, but little is known about the environmental factors controlling these water fluxes. We evaluated how contrasting atmospheric and soil water conditions control the diurnal, nocturnal and seasonal dynamics of sap flow in Drimys brasiliensis (Miers), a common Neotropical cloud forest species. We monitored the seasonal variation of soil water content, micrometeorological conditions and sap flow of D. brasiliensis trees in the field during wet and dry seasons. We also conducted a greenhouse experiment exposing D. brasiliensis saplings under contrasting soil water conditions to deuterium-labelled fog water. We found that during the night D. brasiliensis shows heightened stomatal sensitivity to soil drought and vapour pressure deficit, which reduces night-time water loss. Leaf-wetting events had a strong suppressive effect on tree transpiration (E). Foliar water uptake increased in magnitude with drier soil and longer leaf-wetting events. The difference between diurnal and nocturnal stomatal behaviour in D. brasiliensis could be attributed to an optimization of carbon gain when leaves are dry, as well as minimization of nocturnal water loss. Leaf-wetting events, on the other hand, seem important to the water balance of D. brasiliensis, especially during soil droughts, both by suppressing tree transpiration (E) and by providing a small additional water supply through FWU. Our results suggest that a decrease in leaf-wetting events in TMCF might increase D. brasiliensis water loss and decrease its water gains, which could compromise its ecophysiological performance and survival during dry periods.