881 results for RISK-ADAPTED TREATMENT
Abstract:
BACKGROUND Magnetic resonance imaging (MRI) of the prostate is considered the most precise noninvasive staging modality for localized prostate cancer. Multiparametric MRI (mpMRI) dynamic sequences have recently been shown to further increase staging accuracy relative to morphological imaging alone. Correct radiological staging, particularly the detection of extraprostatic disease extension, is of paramount importance for target volume definition and dose prescription in highly conformal curative radiotherapy (RT); in addition, it may affect the risk-adapted duration of additional antihormonal therapy. The purpose of our study was to analyze the impact of mpMRI-based tumor staging in patients undergoing primary RT for prostate cancer. METHODS A total of 122 patients admitted for primary RT for prostate cancer were retrospectively analyzed regarding initial clinical and computed tomography-based staging in comparison with mpMRI staging. Both tumor stage shifts and overall risk group shifts, including prostate-specific antigen (PSA) level and Gleason score, were assessed. Potential risk factors for upstaging were tested in a multivariate analysis. Finally, the impact of the mpMRI-based staging shift on prostate RT and antihormonal therapy was evaluated. RESULTS Overall, a tumor stage shift occurred in 55.7% of patients after mpMRI. Upstaging was most prominent in patients with high-risk serum PSA levels (73%), but was also substantial in patients presenting with low-risk PSA levels (50%) and low-risk Gleason scores (45.2%). Risk group changes occurred in 28.7% of patients, with consequent treatment adaptations regarding target volume delineation and duration of androgen deprivation therapy. High PSA levels were a significant risk factor for tumor upstaging and for newly diagnosed seminal vesicle infiltration on mpMRI.
CONCLUSIONS Our findings suggest that mpMRI of the prostate leads to substantial tumor upstaging, and can considerably affect treatment decisions in all patient groups undergoing risk-adapted curative RT for prostate cancer.
Abstract:
Introduction: The aim was to confirm that the PSF (probability of stone formation) changed appropriately following medical therapy in recurrent stone formers. Materials and Methods: Data were collected on 26 Brazilian stone formers. A baseline 24-hour urine collection was performed prior to treatment. Details of the medical treatment initiated for stone disease were recorded. A PSF calculation was performed on the 24-hour urine sample using the 7 urinary parameters required: voided volume, oxalate, calcium, urate, pH, citrate and magnesium. A repeat 24-hour urine collection was performed for PSF calculation after treatment. Comparison was made between the PSF scores before and during treatment. Results: At baseline, 20 of the 26 patients (77%) had a high PSF score (> 0.5). Of the 26 patients, 17 (65%) showed an overall reduction in their PSF profiles on a medical treatment regimen. Eleven patients (42%) changed from high risk (PSF > 0.5) to low risk (PSF < 0.5), and 6 patients reduced their risk score but did not change risk category. Six patients (23%) remained in the high-risk category (> 0.5) at both assessments. Conclusions: The PSF score was reduced following medical treatment in the majority of patients in this cohort.
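The study's risk dichotomy, a fixed 0.5 cutoff on the PSF score, can be written as a small helper. This is an illustrative sketch of the classification rule only; the underlying PSF model computed from the 7 urinary parameters is not specified in the abstract, so the functions below take an already-computed score.

```python
def psf_category(psf: float) -> str:
    # The study's cutoff: PSF > 0.5 is "high" risk, otherwise "low" risk.
    return "high" if psf > 0.5 else "low"

def risk_improved(before: float, after: float) -> bool:
    # True when a patient moves from the high-risk to the low-risk category,
    # as 11 of the 26 patients in this cohort did.
    return psf_category(before) == "high" and psf_category(after) == "low"
```

Note that a patient whose score falls (say, 0.8 to 0.6) without crossing the cutoff counts as a score reduction but not a category change, matching the 6 patients described above.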
Abstract:
Susceptibility to acute lymphoblastic leukemia (ALL) can be highly influenced by genetic polymorphisms in genes encoding metabolizing enzymes for environmental carcinogens. This study aimed to evaluate the impact of the CYP3A5 and NAT2 metabolizing enzyme polymorphisms on the risk of childhood acute lymphoblastic leukemia. The analysis was conducted on 204 ALL patients and 364 controls from a Brazilian population, using PCR-RFLP. The CYP3A5*3 polymorphic homozygous genotype was more frequent among ALL patients, and the *3 allele variant was significantly associated with increased risk of childhood ALL (OR = 0.29; 95% CI, 0.14-0.60). The homozygous polymorphic genotype for the *6 allele variant was extremely rare and found in only two individuals. The heterozygous frequencies were similar for the ALL group and the control group. No significant differences were observed between the groups analyzed regarding NAT2 variant polymorphisms. None of the polymorphisms analyzed was related to treatment outcome. The results suggest that the CYP3A5*3 polymorphism may play an important role in the risk of childhood ALL.
Abstract:
Objectives: Several clinical trials conducted in Europe and the US reported favorable outcomes for patients with APL treated with the combination of all-trans retinoic acid (ATRA) and anthracyclines. Nevertheless, the results observed in developing countries with the same regimen were poorer, mainly because of high early mortality, chiefly due to bleeding. The International Consortium on Acute Promyelocytic Leukemia (IC-APL) is an initiative of the International Members Committee of the ASH, and the project aims to reduce this gap through the establishment of an international network, which was launched in Brazil, Mexico and Uruguay. Methods: The IC-APL treatment protocol is similar to the PETHEMA 2005 protocol, but substitutes daunorubicin for idarubicin. All patients with a suspected diagnosis of APL were immediately started on ATRA, while bone marrow samples were shipped to a national central laboratory where genetic verification of the diagnosis was performed. Immunofluorescence using an anti-PML antibody allowed rapid confirmation of the diagnosis, and the importance of supportive measures was reinforced. Results: The interim analysis of 97 patients enrolled in the IC-APL protocol showed that the complete remission (CR) rate was 83%, and the 2-year overall survival and disease-free survival were 80% and 90%, respectively. Of note, the early mortality rate was reduced to 7.5%. Discussion: The results of the IC-APL demonstrate the impact of educational programs and networking on the improvement of leukemia treatment outcomes in developing countries.
Abstract:
BACKGROUND AND PURPOSE: Sleep-disordered breathing (SDB) is frequent in stroke patients. Risk factors, treatment response, and short-term and long-term outcomes of SDB in stroke patients are poorly known. METHODS: We prospectively studied 152 patients (mean age 56±13 years) with acute ischemic stroke. Cardiovascular risk factors, Epworth sleepiness score (ESS), stroke severity/etiology, and time of stroke onset were assessed. The apnea-hypopnea index (AHI) was determined 3±2 days after stroke onset and 6 months later (subacute phase). Continuous positive airway pressure (CPAP) treatment was started acutely in patients with SDB (AHI ≥15, or AHI ≥10 with ESS >10). CPAP compliance, incidence of vascular events, and stroke outcome were assessed 60±16 months later (chronic phase). RESULTS: Initial AHI was 18±16 (≥10 in 58% and ≥30 in 17% of patients) and decreased in the subacute phase (P<0.001). Age, diabetes, and nighttime stroke onset were independent predictors of AHI (r²=0.34). In patients with AHI ≥30, age, male gender, body mass index, diabetes, hypertension, coronary heart disease, ESS, and macroangiopathic etiology of stroke were significantly higher/more common than in patients with AHI <10. Long-term incidence of vascular events and stroke outcome were similar in both groups. CPAP was started in 51% and continued chronically in 15% of SDB patients. Long-term stroke mortality was associated with initial AHI, age, hypertension, diabetes, and coronary heart disease. CONCLUSIONS: SDB is common, particularly in elderly male stroke patients with diabetes, nighttime stroke onset, and macroangiopathy as the cause of stroke; it improves after the acute phase, is associated with increased poststroke mortality, and can be treated with CPAP in a small percentage of patients.
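The study's CPAP initiation criterion (AHI ≥15, or AHI ≥10 combined with an Epworth score >10) is a simple compound threshold; the predicate below is an illustrative restatement of that rule, not part of the published protocol.

```python
def cpap_indicated(ahi: float, ess: float) -> bool:
    """CPAP initiation rule as described in the abstract:
    AHI >= 15, or AHI >= 10 together with an Epworth sleepiness score > 10."""
    return ahi >= 15 or (ahi >= 10 and ess > 10)
```

For example, a patient with AHI 12 qualifies only if daytime sleepiness (ESS >10) is also present, while AHI 15 or above qualifies regardless of ESS.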
Abstract:
Developing a robust method to study characteristics of vascular flow using ultrasound may be useful to assess endothelial function and vasodilatation. There are four stages in this proposal. 1. The first stage is to standardise and validate the methodology to enable computational risk flow data and other flow characteristics to be used clinically (current study). Further development of fluid modelling methods will enable particulate haemodynamics to be investigated and will incorporate detailed endothelial structure together with cellular pathways. 2. This should be followed by studies in different patient groups investigating the association between the derived values and estimated risk (using other methods such as the Framingham risk score). 3. Then, prospective studies, stratified by underlying cardiovascular risk, would be conducted to establish whether computational flow dynamic data can predict outcome. 4. If successful, this could prove to be a very useful marker of benefit following treatment in a clinical setting.
Abstract:
Today, the classification systems for myelodysplastic syndromes (MDS) and acute myeloid leukemia (AML) already incorporate cytogenetic and molecular genetic aberrations in an attempt to better reflect disease biology. However, in many MDS/AML patients no genetic aberrations have been identified yet, and even within some cytogenetically well-defined subclasses there is considerable clinical heterogeneity. Recent advances in genomics technologies such as gene expression profiling (GEP) provide powerful tools to further characterize myeloid malignancies at the molecular level, with the goal of refining the MDS/AML classification system by incorporating as yet unknown molecular genetic and epigenetic pathomechanisms, which are likely reflected by aberrant gene expression patterns. In this study, we provide a comprehensive review of how GEP has contributed to a refined molecular taxonomy of MDS and AML with regard to diagnosis, prediction of clinical outcome, discovery of novel subclasses, and identification of novel therapeutic targets and novel drugs. As many challenges remain ahead, we discuss the pitfalls of this technology and its potential, including future integrative studies with other genomics technologies, which will continue to improve our understanding of malignant transformation in myeloid malignancies and thereby contribute to individualized risk-adapted treatment strategies for MDS and AML patients. Leukemia (2011) 25, 909-920; doi:10.1038/leu.2011.48; published online 29 March 2011
Abstract:
Increased fibrinolysis is an important component of the acute promyelocytic leukemia (APL) bleeding diathesis. APL blasts overexpress annexin II (ANXII), a receptor for tissue plasminogen activator (tPA) and plasminogen, thereby increasing plasmin generation. Previous studies suggested that ANXII plays a pivotal role in APL coagulopathy. ANXII binding to tPA can be inhibited by homocysteine, and hyperhomocysteinemia can be induced by L-methionine supplementation. In the present study, we used an APL mouse model to study ANXII function and the effects of hyperhomocysteinemia in vivo. Leukemic cells expressed higher levels of ANXII, and tPA plasma levels were higher in leukemic than in wild-type mice (11.95 ng/mL vs 10.74 ng/mL; P = .004). In leukemic mice, administration of L-methionine significantly increased homocysteine levels (49.0 μmol/mL and < 6.0 μmol/mL in the treated and nontreated groups, respectively) and reduced tPA levels to baseline concentrations. tPA levels were also decreased after infusion of the LCKLSL peptide, a competitor for the ANXII tPA-binding site (11.07 ng/mL; P = .001). We also expressed and purified the p36 component of ANXII in Pichia methanolica. Infusion of p36 in wild-type mice increased tPA and thrombin-antithrombin levels, and the latter effect was reversed by L-methionine administration. The results of the present study demonstrate the relevance of ANXII in vivo and suggest that methionine-induced hyperhomocysteinemia may reverse hyperfibrinolysis in APL. (Blood. 2012;120(1):207-213)
Abstract:
Introduction: We previously reported the results of a phase II study of patients with newly diagnosed primary CNS lymphoma (PCNSL) treated with autologous peripheral blood stem-cell transplantation (aPBSCT) and response-adapted whole brain radiotherapy (WBRT). The purpose of this report is to update the initial results and provide long-term data regarding overall survival, prognostic factors, and the risk of treatment-related neurotoxicity. Methods: A long-term follow-up was conducted on surviving primary CNS lymphoma patients treated according to the "OSHO-53 study", which was initiated by the Ostdeutsche Studiengruppe Hämatologie-Onkologie. Between August 1999 and October 2004, twenty-three patients with an average age of 55 and a median Karnofsky performance score of 70% were enrolled and received high-dose methotrexate (HD-MTX) on days 1 and 10. In the case of at least a partial remission (PR), high-dose busulfan/thiotepa (HD-BuTT) followed by aPBSCT was performed. Patients without a response to induction or without complete remission (CR) after HD-BuTT received WBRT. All patients (n=8) who were alive in 2011 were contacted, and the Mini Mental State Examination (MMSE) and the EORTC QLQ-C30 were administered. Results: Eight patients are still alive, with a median follow-up of 116.9 months (range 79-141). One of them suffered a late relapse eight and a half years after the initial diagnosis of PCNSL; another suffers from a gallbladder carcinoma. Both patients are alive: the one with the PCNSL relapse has finished rescue therapy and remains under observation, and the one with gallbladder carcinoma is still under therapy. The MMSE and QLQ-C30 showed impressive results in the patients who were not irradiated. Only one of the irradiated patients is still alive, with a clear neurologic deficit but acceptable quality of life. Conclusions: Long-term follow-up of our patients included in the OSHO-53 study shows an overall survival of 30 percent. If WBRT can be avoided, no long-term neurotoxicity has been observed and patients benefit from excellent quality of life. Induction chemotherapy with two cycles of HD-MTX should be intensified to improve the unsatisfactory overall survival of 30 percent.
Treatment intensification and risk factor control: toward more clinically relevant quality measures.
Abstract:
BACKGROUND: Intensification of pharmacotherapy in persons with poorly controlled chronic conditions has been proposed as a clinically meaningful process measure of quality. OBJECTIVE: To validate measures of treatment intensification by evaluating their associations with subsequent control of hypertension, hyperlipidemia, and diabetes mellitus across 35 medical facility populations in Kaiser Permanente, Northern California. DESIGN: Hierarchical analyses of associations of improvements in facility-level treatment intensification rates from 2001 to 2003 with patient-level risk factor levels at the end of 2003. PATIENTS: Members (515,072 and 626,130 in 2001 and 2003, respectively; age >20 years) with hypertension, hyperlipidemia, and/or diabetes mellitus. MEASUREMENTS: Treatment intensification for each risk factor was defined as an increase in the number of drug classes prescribed, an increase in dosage for at least 1 drug, or a switch to a drug from another class within 3 months of observed poor risk factor control. RESULTS: Facility-level improvements in treatment intensification rates between 2001 and 2003 were strongly associated with a greater likelihood of being in control at the end of 2003 (P ≤ 0.05 for each risk factor) after adjustment for patient- and facility-level covariates. Compared with facility rankings based solely on control, adding the percentages of poorly controlled patients who received treatment intensification changed 2003 rankings substantially: 14%, 51%, and 29% of the facilities changed ranks by 5 or more positions for hypertension, hyperlipidemia, and diabetes, respectively. CONCLUSIONS: Treatment intensification is tightly linked to improved control. Thus, it deserves consideration as a process measure for motivating quality improvement and possibly for measuring clinical performance.
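The intensification definition above is operational: compare the regimen at the poor-control visit with the regimen at a follow-up within 3 months, and flag any of (a) more drug classes, (b) a dose increase for at least one drug, or (c) a switch to a drug from another class. The sketch below restates that rule under assumed data shapes; the drug names, classes, doses, and the regimen-as-dict representation are all hypothetical, not from the study.

```python
def intensified(before, after, days_elapsed, window_days=90):
    """Illustrative check of the abstract's intensification definition.
    `before`/`after` map drug name -> (drug_class, daily_dose), snapshotted
    at the poor-control visit and at a follow-up visit."""
    if days_elapsed > window_days:
        return False  # outside the 3-month window
    classes_before = {cls for cls, _ in before.values()}
    classes_after = {cls for cls, _ in after.values()}
    # (a) increase in the number of drug classes prescribed
    if len(classes_after) > len(classes_before):
        return True
    # (b) dosage increase for at least one continuing drug
    for drug, (cls, dose) in after.items():
        if drug in before and dose > before[drug][1]:
            return True
    # (c) switch to a drug from a class not previously prescribed
    if classes_after - classes_before:
        return True
    return False
```

Under this representation, doubling a dose, adding a second agent, or swapping to a new class all count as intensification, while an unchanged regimen or a change made after the 90-day window does not.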
Abstract:
Urban transit system performance may be quantified and assessed using transit capacity and productive capacity for planning, design and operational management. Bunker (4) defines important productive performance measures of an individual transit service and transit line. Transit work (p-km) captures the transit task performed over distance. Transit productiveness (p-km/h) captures transit work performed over time. This paper applies productive performance with risk assessment to quantify transit system reliability. Theory is developed to monetize transit segment reliability risk on the basis of demonstration Annual Reliability Event rates by transit facility type, segment productiveness, and unit-event severity. A comparative example of peak hour performance of a transit sub-system containing bus-on-street, busway, and rail components in Brisbane, Australia demonstrates through practical application the importance of valuing reliability. Comparison reveals the highest risk segments to be long, highly productive on-street bus segments, followed by busway (BRT) segments and then rail segments. A transit reliability risk reduction treatment example demonstrates that benefits can be significant and should be incorporated into project evaluation in addition to those of regular travel time savings, reduced emissions and safety improvements. Reliability can be used to identify high risk components of the transit system, draw comparisons between modes in both planning and operations settings, and value improvement scenarios in a project evaluation setting. The methodology can also be applied to inform daily transit system operational management.
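The monetization described above combines an annual event rate, segment productiveness, and unit-event severity. A minimal sketch of one plausible multiplicative form follows; the paper's exact formulation is not given in the abstract, and every numeric value below (event rates, productiveness figures, severity costs) is hypothetical, chosen only to illustrate the ranking of facility types.

```python
def segment_risk_cost(annual_event_rate, productiveness_pkmh, unit_event_severity):
    """Annual monetized reliability risk for one transit segment, assuming
    the simple form: events/year x segment productiveness (p-km/h)
    x unit-event severity ($ per p-km/h disrupted per event)."""
    return annual_event_rate * productiveness_pkmh * unit_event_severity

# Hypothetical peak-hour comparison across the three facility types studied.
segments = {
    "on-street bus": segment_risk_cost(40, 12000, 0.05),
    "busway (BRT)": segment_risk_cost(15, 20000, 0.04),
    "rail": segment_risk_cost(5, 30000, 0.03),
}
# Rank segments from highest to lowest annual risk cost.
ranked = sorted(segments, key=segments.get, reverse=True)
```

With these assumed inputs, on-street bus segments carry the highest risk cost despite lower productiveness, because their event rate dominates, which is consistent with the ordering the comparative example reports.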