66 results for CENSORED SURVIVAL-DATA
Abstract:
Purpose: To assess the 5-year survival rate and number of technical, biologic, and esthetic complications involving implant abutments. Materials and Methods: Electronic (Medline) and hand searches were performed to assess studies on metal and ceramic implant abutments. Relevant data from a previous review were included. Two reviewers independently extracted the data. Failure and complication rates were analyzed, and estimates of 5-year survival proportions were calculated from the relationship between event rate and survival function. Multivariable robust Poisson regression was used to compare abutment characteristics. Results: The search yielded 1,558 titles and 274 abstracts. Twenty-four studies were selected for data analysis. The survival rate was 97.5% (95% confidence interval [CI]: 89.6% to 99.4%) for ceramic abutments and 97.6% (95% CI: 96.2% to 98.5%) for metal abutments. The overall 5-year rate for technical complications was 11.8% (95% CI: 8.5% to 16.3%): 8.9% (95% CI: 4.3% to 17.7%) for ceramic and 12.0% (95% CI: 8.5% to 16.8%) for metal abutments. Biologic complications occurred at an overall rate of 6.4% (95% CI: 3.3% to 12.0%): 10.4% (95% CI: 1.9% to 46.7%) for ceramic and 6.1% (95% CI: 3.1% to 12.0%) for metal abutments. Conclusions: This meta-analysis of single-implant prostheses shows high survival rates of single implants, abutments, and prostheses after 5 years of function. No differences were found between the survival and failure rates of ceramic and metal abutments. No significant differences were found for technical, biologic, and esthetic complications of internally and externally connected abutments.
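The 5-year estimates in this review rest on the standard relationship between a constant event rate and the survival function, S(t) = exp(-rate × t). The sketch below is illustrative only (the conversion functions and the example value are assumptions, not the authors' computation).

```python
# Sketch of the event-rate <-> survival-function relationship used for such
# 5-year estimates, assuming a constant (Poisson) event rate. Illustrative only;
# not the review's actual computation.
import math

def five_year_proportion(rate_per_100_years: float) -> float:
    """5-year event proportion implied by a constant event rate via S(t) = exp(-rate*t)."""
    rate = rate_per_100_years / 100.0
    return 1.0 - math.exp(-rate * 5)

def rate_from_five_year_proportion(p_5y: float) -> float:
    """Annual event rate (per 100 abutment-years) implied by a 5-year proportion."""
    return -math.log(1.0 - p_5y) / 5 * 100

# Example: a 5-year technical complication proportion of 11.8% corresponds to
# roughly 2.5 events per 100 abutment-years.
print(round(rate_from_five_year_proportion(0.118), 2))
```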
Abstract:
BACKGROUND Advanced lower extremity peripheral artery disease (PAD), whether presenting as acute limb ischemia (ALI) or chronic critical limb ischemia (CLI), is associated with high rates of cardiovascular ischemic events, amputation, and death. Past research has focused on strategies of revascularization, but few data are available that prospectively evaluate the impact of key process-of-care factors (spanning pre-admission, acute hospitalization, and post-discharge) that might contribute to improving short- and long-term health outcomes. METHODS/DESIGN The FRIENDS registry is designed to prospectively evaluate a range of patient and health-system care delivery factors that might serve as future targets for efforts to improve limb and systemic outcomes for patients with ALI or CLI. This hypothesis-driven registry was designed to evaluate the contributions of: (i) pre-hospital limb ischemia symptom duration, (ii) use of leg revascularization strategies, and (iii) use of risk-reduction pharmacotherapies, as pre-specified factors that may affect amputation-free survival. Sequential patients would be included at an index "vascular specialist-defined" ALI or CLI episode, with patients excluded only for non-vascular etiologies of limb threat. Data including baseline demographics, functional status, co-morbidities, pre-hospital time segments, and use of medical therapies; hospital-based use of revascularization strategies, time segments, and pharmacotherapies; and rates of systemic ischemic events (e.g., myocardial infarction, stroke, hospitalization, and death) and limb ischemic events (e.g., hospitalization for revascularization or amputation) will be recorded during a minimum of one year of follow-up. DISCUSSION The FRIENDS registry is designed to evaluate the potential impact of key factors that may contribute to adverse outcomes for patients with ALI or CLI. Definition of new "health system-based" therapeutic targets could then become the focus of future interventional clinical trials for individuals with advanced PAD.
Abstract:
BACKGROUND Recently, two simple clinical scores were published to predict survival in trauma patients. Both scores may successfully guide major trauma triage, but neither has been independently validated in a hospital setting. METHODS This is a cohort study with 30-day mortality as the primary outcome to validate two new trauma scores, the Mechanism, Glasgow Coma Scale (GCS), Age, and Pressure (MGAP) score and the GCS, Age, and Pressure (GAP) score, using data from the UK Trauma Audit and Research Network. First, discrimination was assessed using the area under the receiver operating characteristic (ROC) curve, and calibration was assessed by comparing mortality rates with those originally published. Second, we calculated sensitivity, specificity, predictive values, and likelihood ratios for prognostic score performance. Third, we propose new cutoffs for the risk categories. RESULTS A total of 79,807 adult (≥16 years) major trauma patients (2000-2010) were included; 5,474 (6.9%) died. Mean (SD) age was 51.5 (22.4) years, median GCS score was 15 (interquartile range, 15-15), and median Injury Severity Score (ISS) was 9 (interquartile range, 9-16). More than 50% of the patients had a low-risk GAP or MGAP score (1% mortality). With regard to discrimination, areas under the ROC curve were 87.2% for the GAP score (95% confidence interval, 86.7-87.7) and 86.8% for the MGAP score (95% confidence interval, 86.2-87.3). With regard to calibration, 2,390 (3.3%), 1,900 (28.5%), and 1,184 (72.2%) patients died in the low-, medium-, and high-risk GAP categories, respectively. In the low- and medium-risk groups, these rates were almost double those previously published. For MGAP, 1,861 (2.8%), 1,455 (15.2%), and 2,158 (58.6%) patients died in the low-, medium-, and high-risk categories, consonant with the results originally published. Reclassifying score point cutoffs improved likelihood ratios, sensitivity and specificity, as well as areas under the ROC curve. CONCLUSION We found both scores to be valid triage tools for stratifying emergency department patients according to their risk of death. MGAP calibrated better, but GAP slightly improved discrimination. The newly proposed cutoffs better differentiate risk classification and may therefore facilitate hospital resource allocation. LEVEL OF EVIDENCE Prognostic study, level II.
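As an illustration of the validation steps described above (discrimination via the area under the ROC curve, then sensitivity, specificity, and a likelihood ratio at a cutoff), the following Python sketch uses scikit-learn on fabricated data; the column names, simulated outcome model, and the cutoff of 18 are assumptions, not values from the study.

```python
# Illustrative sketch of the discrimination and cutoff analysis described above,
# applied to fabricated data. Column names, the simulated outcome, and the cutoff
# are assumptions; this is not the authors' analysis code.
import numpy as np
import pandas as pd
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
df = pd.DataFrame({"gap_score": rng.integers(3, 25, size=1000)})  # GAP roughly spans 3-24
# Fabricated outcome: lower scores carry a higher simulated risk of 30-day death.
p_death = 1.0 / (1.0 + np.exp(0.5 * (df["gap_score"] - 11)))
df["died_30d"] = rng.binomial(1, p_death)

# Discrimination: AUC of the score (negated so that higher values mean higher risk).
auc = roc_auc_score(df["died_30d"], -df["gap_score"])

# Classification metrics at one candidate cutoff: score <= 18 flags "at risk".
flagged = df["gap_score"] <= 18
tp = (flagged & (df["died_30d"] == 1)).sum()
fn = (~flagged & (df["died_30d"] == 1)).sum()
fp = (flagged & (df["died_30d"] == 0)).sum()
tn = (~flagged & (df["died_30d"] == 0)).sum()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
positive_likelihood_ratio = sensitivity / (1 - specificity)
print(round(auc, 3), round(sensitivity, 3), round(specificity, 3),
      round(positive_likelihood_ratio, 2))
```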
Abstract:
OBJECTIVES This study aimed to update the Logistic Clinical SYNTAX score to predict 3-year survival after percutaneous coronary intervention (PCI) and to compare its performance with that of the SYNTAX score alone. BACKGROUND The SYNTAX score is a well-established angiographic tool to predict long-term outcomes after PCI. The Logistic Clinical SYNTAX score, developed by combining clinical variables with the anatomic SYNTAX score, has been shown to perform better than the SYNTAX score alone in predicting 1-year outcomes after PCI. However, the ability of this score to predict long-term survival is unknown. METHODS Patient-level data (N = 6,304, 399 deaths within 3 years) from 7 contemporary PCI trials were analyzed. We revised the overall risk and the predictor effects in the core model (SYNTAX score, age, creatinine clearance, and left ventricular ejection fraction) using Cox regression analysis to predict mortality at 3 years. We also updated the extended model by combining the core model with additional independent predictors of 3-year mortality (i.e., diabetes mellitus, peripheral vascular disease, and body mass index). RESULTS The revised Logistic Clinical SYNTAX models showed better discriminative ability than the anatomic SYNTAX score for the prediction of 3-year mortality after PCI (c-index: SYNTAX score, 0.61; core model, 0.71; and extended model, 0.73 in a cross-validation procedure). The extended model in particular performed better in differentiating low- and intermediate-risk groups. CONCLUSIONS Risk scores combining clinical characteristics with the anatomic SYNTAX score predict 3-year mortality substantially better than the SYNTAX score alone and should be used for long-term risk stratification of patients undergoing PCI.
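A minimal sketch of this kind of model fit in Python with the lifelines package is shown below; the data file and column names are hypothetical placeholders, and this is not the authors' code.

```python
# Minimal sketch of fitting a "core model" Cox regression (anatomic SYNTAX score
# plus clinical covariates) for 3-year mortality and reporting its c-index,
# using lifelines. File and column names are hypothetical; not the authors' code.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("pci_cohort.csv")   # hypothetical patient-level dataset
core_cols = ["time_years", "died", "syntax_score", "age",
             "creatinine_clearance", "lvef"]

cph = CoxPHFitter()
cph.fit(df[core_cols], duration_col="time_years", event_col="died")
cph.print_summary()                         # hazard ratios with 95% CIs
print("c-index:", cph.concordance_index_)   # discrimination, cf. 0.71 for the core model
```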
Abstract:
BACKGROUND Recently, it has been suggested that the type of stent used in primary percutaneous coronary interventions (pPCI) might impact upon the outcomes of patients with acute myocardial infarction (AMI). Indeed, drug-eluting stents (DES) reduce neointimal hyperplasia compared to bare-metal stents (BMS). Moreover, later-generation DES, owing to their biocompatible polymer coatings and stent design, allow for greater deliverability and improved endothelial healing, and therefore less restenosis and thrombus generation. However, data on the safety and performance of DES in large cohorts of AMI patients are still limited. AIM To compare the early outcome of DES vs. BMS in AMI patients. METHODS This was a prospective, multicentre analysis including patients from 64 hospitals in Switzerland with AMI undergoing pPCI between 2005 and 2013. The primary endpoint was in-hospital all-cause death, whereas the secondary endpoint was a composite measure of major adverse cardiac and cerebrovascular events (MACCE) comprising death, reinfarction, and cerebrovascular event. RESULTS Of 20,464 patients with a primary diagnosis of AMI enrolled in the AMIS Plus registry, 15,026 were referred for pPCI and 13,442 received stent implantation; 10,094 patients were implanted with DES and 2,260 with BMS. The overall in-hospital mortality was significantly lower in patients with DES compared to those with BMS implantation (2.6% vs. 7.1%, p < 0.001). The overall in-hospital MACCE rate after DES was likewise lower compared to BMS (3.5% vs. 7.6%, p < 0.001). After adjusting for all confounding covariables, DES remained an independent predictor of lower in-hospital mortality (OR 0.51, 95% CI 0.40-0.67, p < 0.001). Since the groups differed with regard to baseline characteristics and pharmacological treatment, we performed propensity score matching (PSM) to limit potential biases. Even after the PSM, DES implantation remained independently associated with a reduced risk of in-hospital mortality (adjusted OR 0.54, 95% CI 0.39-0.76, p < 0.001). CONCLUSIONS In unselected patients from a nationwide, real-world cohort, we found that DES, compared with BMS, were associated with lower in-hospital mortality and MACCE. The identification of optimal treatment strategies for patients with AMI needs further randomised evaluation; however, our findings suggest a potential benefit with DES.
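The propensity-score-matching step can be sketched in Python (scikit-learn) as below; the file name, covariate list, and 1:1 nearest-neighbour matching without a caliper are illustrative assumptions, not the registry's exact procedure.

```python
# Rough sketch of a propensity-score-matching step like the one described above.
# File name, covariates (assumed numerically coded), and matching details
# (1:1 nearest neighbour, no caliper, with replacement) are assumptions;
# this is not the registry's analysis code.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("amis_plus.csv")                        # hypothetical dataset
covariates = ["age", "sex", "diabetes", "killip_class"]  # assumed baseline variables

# Propensity score: modelled probability of receiving a DES given the covariates.
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["des"])
df["pscore"] = ps_model.predict_proba(df[covariates])[:, 1]

des = df[df["des"] == 1]
bms = df[df["des"] == 0]

# Match each BMS patient to the DES patient with the closest propensity score.
nn = NearestNeighbors(n_neighbors=1).fit(des[["pscore"]])
_, idx = nn.kneighbors(bms[["pscore"]])
matched_des = des.iloc[idx.ravel()]

# Compare in-hospital mortality in the matched sample.
print(matched_des["died_in_hospital"].mean(), bms["died_in_hospital"].mean())
```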
Abstract:
The phosphoinositide 3-kinase (PI3K)/Akt/mammalian target of rapamycin (mTOR) pathway is frequently activated in human cancer and plays a crucial role in glioblastoma biology. We were interested in gaining further insight into the potential of targeting PI3K isoforms as a novel anti-tumor approach in glioblastoma. Consistent expression of the PI3K catalytic isoform PI3K p110α was detected in a panel of glioblastoma patient samples. In contrast, PI3K p110β expression was only rarely detected in glioblastoma patient samples. The expression of a module comprising the epidermal growth factor receptor (EGFR)/PI3K p110α/phosphorylated ribosomal S6 protein (p-S6) was correlated with shorter patient survival. Inhibition of PI3K p110α activity impaired the anchorage-dependent growth of glioblastoma cells and induced tumor regression in vivo. Inhibition of PI3K p110α or PI3K p110β also led to impaired anchorage-independent growth, a decreased migratory capacity of glioblastoma cells, and reduced the activation of the Akt/mTOR pathway. These effects were selective, because targeting of PI3K p110δ did not result in a comparable impairment of glioblastoma tumorigenic properties. Together, our data reveal that drugs targeting PI3K p110α can reduce growth in a subset of glioblastoma tumors characterized by the expression of EGFR/PI3K p110α/p-S6.
Abstract:
BACKGROUND The population-based effectiveness of thoracic endovascular aortic repair (TEVAR) versus open surgery for descending thoracic aortic aneurysm remains in doubt. METHODS Patients aged over 50 years, without a history of aortic dissection, undergoing repair of a thoracic aortic aneurysm between 2006 and 2011 were assessed using mortality-linked individual patient data from Hospital Episode Statistics (England). The principal outcomes were 30-day operative mortality, long-term survival (5 years) and aortic-related reinterventions. TEVAR and open repair were compared using crude and multivariable models that adjusted for age and sex. RESULTS Overall, 759 patients underwent thoracic aortic aneurysm repair, mainly for intact aneurysms (618, 81·4 per cent). Median ages of TEVAR and open cohorts were 73 and 71 years respectively (P < 0·001), with more men undergoing TEVAR (P = 0·004). For intact aneurysms, the operative mortality rate was similar for TEVAR and open repair (6·5 versus 7·6 per cent; odds ratio 0·79, 95 per cent confidence interval (c.i.) 0·41 to 1·49), but the 5-year survival rate was significantly worse after TEVAR (54·2 versus 65·6 per cent; adjusted hazard ratio 1·45, 95 per cent c.i. 1·08 to 1·94). After 5 years, aortic-related mortality was similar in the two groups, but cardiopulmonary mortality was higher after TEVAR. TEVAR was associated with more aortic-related reinterventions (23·1 versus 14·3 per cent; adjusted HR 1·70, 95 per cent c.i. 1·11 to 2·60). There were 141 procedures for ruptured thoracic aneurysm (97 TEVAR, 44 open), with TEVAR showing no significant advantage in terms of operative mortality. CONCLUSION In England, operative mortality for degenerative descending thoracic aneurysm was similar after either TEVAR or open repair. Patients who had TEVAR appeared to have a higher reintervention rate and worse long-term survival, possibly owing to cardiopulmonary morbidity and other selection bias.
Abstract:
INTRODUCTION Even though arthroplasty of the ankle joint is considered an established procedure, only about 1,300 endoprostheses are implanted in Germany annually. Arthrodeses of the ankle joint are performed almost three times more often. This may be due to the availability of the procedure (more than twice as many providers perform arthrodesis) as well as to the high frequency of revision procedures after arthroplasty postulated in the literature. In those publications, however, there is often no clear differentiation between revision surgery with exchange of components, subsequent interventions due to complications, and subsequent surgery not associated with complications. The German Orthopaedic Foot and Ankle Association's (D. A. F.) registry for total ankle replacement collects data on perioperative complications as well as the cause, nature, and extent of subsequent interventions and postoperative patient satisfaction. MATERIAL AND METHODS The D. A. F. total ankle replacement register is a nationwide, voluntary registry. After giving written informed consent, patients can be added to the database by participating providers. Data are collected during the hospital stay for surgical treatment, during routine follow-up examinations, and in the context of revision surgery. The information can be submitted in paper-based or online formats. The survey instruments are available as minimum data sets or as scientific questionnaires that include patient-reported outcome measures (PROMs). The pseudonymised clinical data are collected and evaluated at the Institute for Evaluative Research in Medicine, University of Bern, Switzerland (IEFM). The patient-related data remain on the register's module server in North Rhine-Westphalia, Germany. The registry's methodology, as well as the results on revisions and patient satisfaction for 115 patients with a two-year follow-up period, is presented. Statistical analyses were performed with SAS™ (Version 9.4, SAS Institute, Inc., Cary, NC, USA). RESULTS About 2½ years after the register was launched, 621 datasets on primary implantations, 1,427 on follow-up examinations, and 121 on reoperations were available. 49 % of the patients received their implants for post-traumatic osteoarthritis, 27 % for primary osteoarthritis, and 15 % for rheumatic disease. More than 90 % of the primary interventions proceeded without complications. Subsequent interventions were recorded for 84 patients, which corresponds to a rate of 13.5 % with respect to the primary implantations. It should be noted that these secondary procedures also include two-stage procedures not due to a complication. "True revisions" are interventions with exchange of components due to mechanical complications and/or infection and were present in 7.6 % of patients. 415 of the patients commented on their satisfaction with the operative result at the last follow-up: 89.9 % evaluated their outcome as excellent or good, 9.4 % as moderate, and only 0.7 % (3 patients) as poor. In these three cases, component loosening or symptomatic subtalar (USG) osteoarthritis was present. Two-year follow-up data using the American Orthopedic Foot and Ankle Society Ankle and Hindfoot Scale (AOFAS-AHS) are already available for 115 patients. The median AOFAS-AHS score increased from 33 points preoperatively to more than 80 points three to six months postoperatively, and this improvement remained nearly constant over the entire two-year follow-up period.
CONCLUSION Covering less than 10 % of the approximately 240 providers in Germany and approximately 12 % of the annually implanted total ankle replacements, the D. A. F. register is still far from being a national registry. Nevertheless, its geographical coverage and the inclusion of "high-volume" surgeons (more than 100 total ankle replacements a year) and "low-volume" surgeons (fewer than 5 total ankle replacements a year) make the register representative of Germany. The registry data show that the rate of subsequent interventions, and in particular of "true revision" procedures, is markedly lower than the 20 % often postulated in the literature. In addition, a high level of patient satisfaction over the short and medium term is recorded. From the perspective of the authors, these results indicate that total ankle arthroplasty, given a correct indication and appropriate selection of patients, is not inferior to ankle arthrodesis with regard to patient satisfaction and function. The first valid survival rates can be expected about 10 years after the register's start.
Abstract:
Tyrosine kinase inhibitors (TKI) have changed the natural course of chronic myeloid leukemia (CML). With the advent of second-generation TKIs, safety and efficacy issues have gained interest. The randomized CML-Study IV was used for a long-term evaluation of imatinib (IM). 1,503 patients received IM, 1,379 of them as IM monotherapy. After a median observation of 7.1 years, 965 patients (64%) were still receiving IM. At 10 years, progression-free survival was 82% and overall survival 84%; 59% achieved MR(5), 72% MR(4.5), 81% MR(4), 89% major molecular remission, and 92% MR(2) (molecular equivalent to complete cytogenetic remission). All response levels were reached faster with IM 800 mg except MR(5). Eight-year probabilities of adverse drug reactions (ADR) were 76% overall, 22% for grade 3-4, 73% for non-hematologic, and 28% for hematologic ADR. More ADR were observed with IM 800 mg and with IM 400 mg plus interferon α (IFN). Most patients had their first ADR early, with decreasing frequency later on. No new late toxicity was observed. ADR to IM are frequent but mostly mild and manageable, also with IM 800 mg and IM 400 mg + IFN. The deep molecular response rates indicate that most patients are candidates for IM discontinuation. After 10 years, IM continues to be an excellent initial choice for most patients with CML. Leukemia advance online publication, 13 March 2015; doi:10.1038/leu.2015.36.
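The 10-year survival figures above are the kind of quantities read off Kaplan-Meier estimates for censored follow-up data; a minimal lifelines sketch follows (file and column names are hypothetical, not the study's code).

```python
# Sketch of how 10-year overall and progression-free survival probabilities like
# those quoted above are typically obtained from Kaplan-Meier estimates.
# Columns are hypothetical: time in years plus an event indicator per endpoint.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("cml_iv.csv")                 # hypothetical dataset

kmf = KaplanMeierFitter()
kmf.fit(df["os_years"], event_observed=df["died"], label="overall survival")
print(kmf.survival_function_at_times(10))      # e.g. ~0.84 reported in the trial

kmf.fit(df["pfs_years"], event_observed=df["progressed_or_died"],
        label="progression-free survival")
print(kmf.survival_function_at_times(10))      # e.g. ~0.82 reported in the trial
```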
Abstract:
BACKGROUND Potentially avoidable risk factors continue to cause unnecessary disability and premature death in older people. Health risk assessment (HRA), a method successfully used in working-age populations, is a promising method for cost-effective health promotion and preventive care in older individuals, but the long-term effects of this approach are unknown. The objective of this study was to evaluate the effects of an innovative approach to HRA and counselling in older individuals for health behaviours, preventive care, and long-term survival. METHODS AND FINDINGS This study was a pragmatic, single-centre randomised controlled clinical trial in community-dwelling individuals aged 65 y or older registered with one of 19 primary care physician (PCP) practices in a mixed rural and urban area in Switzerland. From November 2000 to January 2002, 874 participants were randomly allocated to the intervention and 1,410 to usual care. The intervention consisted of HRA based on self-administered questionnaires and individualised computer-generated feedback reports, combined with nurse and PCP counselling over a 2-y period. Primary outcomes were health behaviours and preventive care use at 2 y and all-cause mortality at 8 y. At baseline, participants in the intervention group had a mean ± standard deviation of 6.9 ± 3.7 risk factors (including unfavourable health behaviours, health and functional impairments, and social risk factors) and 4.3 ± 1.8 deficits in recommended preventive care. At 2 y, favourable health behaviours and use of preventive care were more frequent in the intervention than in the control group (based on z-statistics from generalised estimating equation models). For example, 70% compared to 62% were physically active (odds ratio 1.43, 95% CI 1.16-1.77, p = 0.001), and 66% compared to 59% had influenza vaccinations in the past year (odds ratio 1.35, 95% CI 1.09-1.66, p = 0.005). At 8 y, based on an intention-to-treat analysis, the estimated proportion alive was 77.9% in the intervention and 72.8% in the control group, for an absolute mortality difference of 4.9% (95% CI 1.3%-8.5%, p = 0.009; based on z-test for risk difference). The hazard ratio of death comparing intervention with control was 0.79 (95% CI 0.66-0.94, p = 0.009; based on Wald test from Cox regression model), and the number needed to receive the intervention to prevent one death was 21 (95% CI 12-79). The main limitations of the study include the single-site study design, the use of a brief self-administered questionnaire for 2-y outcome data collection, the unavailability of other long-term outcome data (e.g., functional status, nursing home admissions), and the availability of long-term follow-up data on mortality for analysis only in 2014. CONCLUSIONS This is the first trial to our knowledge demonstrating that a collaborative care model of HRA in community-dwelling older people not only results in better health behaviours and increased use of recommended preventive care interventions, but also improves survival. The intervention tested in our study may serve as a model of how to implement a relatively low-cost but effective programme of disease prevention and health promotion in older individuals. TRIAL REGISTRATION International Standard Randomized Controlled Trial Number: ISRCTN 28458424.
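The reported number needed to treat follows directly from the absolute mortality difference; a quick check of the arithmetic:

```python
# Quick check of the number-needed-to-treat arithmetic: NNT is the reciprocal of
# the absolute risk difference in mortality at 8 years.
absolute_mortality_difference = 0.049   # 4.9% as reported
nnt = 1 / absolute_mortality_difference
print(round(nnt, 1))   # ~20.4, i.e. about 21 people receive the intervention per death prevented
```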
Abstract:
Adaptation potential of forests to rapid climatic changes can be assessed from vegetation dynamics during past climatic changes as preserved in fossil pollen data. However, pollen data reflect the integrated effects of climate and biotic processes, such as establishment, survival, competition, and migration. To disentangle these processes, we compared an annually laminated late Würm and Holocene pollen record from the Central Swiss Plateau with simulations of a dynamic forest patch model. All input data used in the simulations were largely independent of pollen data; i.e., the analysis presented here is non-circular. Temperature and precipitation scenarios were based on reconstructions from pollen-independent sources. The earliest arrival times of the species at the study site after the last glacial were inferred from pollen maps. We ran a series of simulations under different combinations of climate and immigration scenarios. In addition, the sensitivity of the simulated presence/absence of four major species to changes in the climate scenario was examined. The pattern of the pollen record could partly be explained by the climate scenario used, mostly by temperature. However, some features, in particular the absence of most species during the late Würm, could only be simulated if the winter temperature anomalies of the scenario were decreased considerably. Consequently, we had to assume in the simulations that most species immigrated during or after the Younger Dryas (12 000 years BP), Abies and Fagus even later. Given the timing of tree species immigration, the vegetation was in equilibrium with climate during long periods, but responded with lags on the time-scale of centuries to millennia, caused by secondary succession after rapid climatic changes (such as at the end of the Younger Dryas) or by the immigration of dominant taxa. Climate influenced the tree taxa both directly and indirectly by changing inter-specific competition. We conclude that, also during the present fast climatic change, species migration might be an important process, particularly if geographic barriers such as the Alps lie in the migration path.
Abstract:
BACKGROUND Febrile neutropenia (FN) and other infectious complications are some of the most serious treatment-related toxicities of chemotherapy for cancer, with a mortality rate of 2% to 21%. The two main types of prophylactic regimens are granulocyte (macrophage) colony-stimulating factors (G(M)-CSF) and antibiotics, frequently quinolones or cotrimoxazole. Current guidelines recommend the use of colony-stimulating factors when the risk of febrile neutropenia is above 20%, but they do not mention the use of antibiotics. However, both regimens have been shown to reduce the incidence of infections. Since no systematic review had compared the two regimens, this review was undertaken. OBJECTIVES To compare the efficacy and safety of G(M)-CSF with antibiotics in cancer patients receiving myelotoxic chemotherapy. SEARCH METHODS We searched The Cochrane Library, MEDLINE, EMBASE, databases of ongoing trials, and conference proceedings of the American Society of Clinical Oncology and the American Society of Hematology (1980 to December 2015). We planned to include both full-text and abstract publications. Two review authors independently screened search results. SELECTION CRITERIA We included randomised controlled trials (RCTs) comparing prophylaxis with G(M)-CSF versus antibiotics for the prevention of infection in cancer patients of all ages receiving chemotherapy. All study arms had to receive identical chemotherapy regimens and other supportive care. We included full-text publications, abstracts, and unpublished data if sufficient information on study design, participant characteristics, interventions, and outcomes was available. We excluded cross-over trials, quasi-randomised trials, and post-hoc retrospective trials. DATA COLLECTION AND ANALYSIS Two review authors independently screened the results of the search strategies, extracted data, assessed risk of bias, and analysed data according to standard Cochrane methods. We did the final interpretation together with an experienced clinician. MAIN RESULTS In this updated review, we included no new randomised controlled trials. We included two trials in the review: one comparing G-CSF with antibiotics in 40 breast cancer patients receiving high-dose chemotherapy, and a second evaluating 155 patients with small-cell lung cancer receiving GM-CSF or antibiotics. We judged the overall risk of bias in the G-CSF trial as high, as neither patients nor physicians were blinded and not all included patients were analysed as randomised (7 out of 40 patients). We considered the overall risk of bias in the GM-CSF trial to be moderate because of the risk of performance bias (neither patients nor personnel were blinded), but with a low risk of selection and attrition bias. For the trial comparing G-CSF to antibiotics, all-cause mortality was not reported. There was no evidence of a difference in infection-related mortality, with zero events in each arm. Microbiologically or clinically documented infections, severe infections, quality of life, and adverse events were not reported. There was no evidence of a difference in the frequency of febrile neutropenia (risk ratio (RR) 1.22; 95% confidence interval (CI) 0.53 to 2.84). The quality of the evidence for the two reported outcomes, infection-related mortality and frequency of febrile neutropenia, was very low, owing to the low number of patients evaluated (high imprecision) and the high risk of bias. There was no evidence of a difference in median survival time in the trial comparing GM-CSF and antibiotics.
The two-year survival rate was 6% (0 to 12%) in both arms (high imprecision, low quality of evidence). There were four toxic deaths in the GM-CSF arm and three in the antibiotics arm (3.8%), without evidence of a difference (RR 1.32; 95% CI 0.30 to 5.69; P = 0.71; low quality of evidence). Grade III or IV infections occurred in 28% of patients in the GM-CSF arm and 18% in the antibiotics arm, without evidence of a difference (RR 1.55; 95% CI 0.86 to 2.80; P = 0.15; low quality of evidence). There were 5 episodes of grade IV infections out of 360 cycles in the GM-CSF arm and 3 episodes out of 334 cycles in the cotrimoxazole arm (0.8%), with no evidence of a difference (RR 1.55; 95% CI 0.37 to 6.42; P = 0.55; low quality of evidence). There was no significant difference between the two arms for non-haematological toxicities such as diarrhoea, stomatitis, infections, or neurologic, respiratory, or cardiac adverse events. Grade III and IV thrombopenia occurred significantly more frequently in the GM-CSF arm (60.8%) than in the antibiotics arm (28.9%) (RR 2.10; 95% CI 1.41 to 3.12; P = 0.0002; low quality of evidence). Neither infection-related mortality, incidence of febrile neutropenia, nor quality of life was reported in this trial. AUTHORS' CONCLUSIONS As we found only two small trials with 195 patients altogether, no conclusion for clinical practice is possible. More trials are necessary to assess the benefits and harms of G(M)-CSF compared to antibiotics for infection prevention in cancer patients receiving chemotherapy.
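The grade IV infection comparison can be reproduced from the reported counts with the usual Wald interval on the log risk-ratio scale:

```python
# Reproducing the grade IV infection risk ratio reported above from the raw counts
# (5 of 360 cycles vs. 3 of 334 cycles), using the standard Wald interval on the
# log risk-ratio scale.
import math

a, n1 = 5, 360      # GM-CSF arm: events, cycles
b, n2 = 3, 334      # cotrimoxazole arm: events, cycles

rr = (a / n1) / (b / n2)
se_log_rr = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(round(rr, 2), round(lo, 2), round(hi, 2))   # ~1.55 (0.37 to 6.42), matching the review
```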
Abstract:
OBJECTIVE: To evaluate serum concentrations of biochemical markers and survival time in dogs with protein-losing enteropathy (PLE). DESIGN: Prospective study. ANIMALS: 29 dogs with PLE and 18 dogs with food-responsive diarrhea (FRD). PROCEDURES: Data regarding serum concentrations of various biochemical markers at the initial evaluation were available for 18 of the 29 dogs with PLE and compared with findings for dogs with FRD. Correlations between biochemical marker concentrations and survival time (interval between time of initial evaluation and death or euthanasia) for dogs with PLE were evaluated. RESULTS: Serum C-reactive protein concentration was high in 13 of 18 dogs with PLE and in 2 of 18 dogs with FRD. Serum concentration of canine pancreatic lipase immunoreactivity was high in 3 dogs with PLE but within the reference interval in all dogs with FRD. Serum α1-proteinase inhibitor concentration was less than the lower reference limit in 9 dogs with PLE and 1 dog with FRD. Compared with findings in dogs with FRD, values of those 3 variables in dogs with PLE were significantly different. Serum calprotectin (measured by radioimmunoassay and ELISA) and S100A12 concentrations were high but did not differ significantly between groups. Seventeen of the 29 dogs with PLE were euthanized owing to this disease; median survival time was 67 days (range, 2 to 2,551 days). CONCLUSIONS AND CLINICAL RELEVANCE: Serum C-reactive protein, canine pancreatic lipase immunoreactivity, and α1-proteinase inhibitor concentrations differed significantly between dogs with PLE and FRD. Most initial biomarker concentrations were not predictive of survival time in dogs with PLE.
Abstract:
BACKGROUND Long-term hormone therapy has been the standard of care for advanced prostate cancer since the 1940s. STAMPEDE is a randomised controlled trial using a multiarm, multistage platform design. It recruits men with high-risk, locally advanced, metastatic or recurrent prostate cancer who are starting first-line long-term hormone therapy. We report primary survival results for three research comparisons testing the addition of zoledronic acid, docetaxel, or their combination to standard of care versus standard of care alone. METHODS Standard of care was hormone therapy for at least 2 years; radiotherapy was encouraged for men with N0M0 disease to November, 2011, then mandated; radiotherapy was optional for men with node-positive non-metastatic (N+M0) disease. Stratified randomisation (via minimisation) allocated men 2:1:1:1 to standard of care only (SOC-only; control), standard of care plus zoledronic acid (SOC + ZA), standard of care plus docetaxel (SOC + Doc), or standard of care with both zoledronic acid and docetaxel (SOC + ZA + Doc). Zoledronic acid (4 mg) was given for six 3-weekly cycles, then 4-weekly until 2 years, and docetaxel (75 mg/m(2)) for six 3-weekly cycles with prednisolone 10 mg daily. There was no blinding to treatment allocation. The primary outcome measure was overall survival. Pairwise comparisons of research versus control had 90% power at 2·5% one-sided α for hazard ratio (HR) 0·75, requiring roughly 400 control arm deaths. Statistical analyses were undertaken with standard log-rank-type methods for time-to-event data, with hazard ratios (HRs) and 95% CIs derived from adjusted Cox models. This trial is registered at ClinicalTrials.gov (NCT00268476) and ControlledTrials.com (ISRCTN78818544). FINDINGS 2962 men were randomly assigned to four groups between Oct 5, 2005, and March 31, 2013. Median age was 65 years (IQR 60-71). 1817 (61%) men had M+ disease, 448 (15%) had N+/X M0, and 697 (24%) had N0M0. 165 (6%) men were previously treated with local therapy, and median prostate-specific antigen was 65 ng/mL (IQR 23-184). Median follow-up was 43 months (IQR 30-60). There were 415 deaths in the control group (347 [84%] prostate cancer). Median overall survival was 71 months (IQR 32 to not reached) for SOC-only, not reached (32 to not reached) for SOC + ZA (HR 0·94, 95% CI 0·79-1·11; p=0·450), 81 months (41 to not reached) for SOC + Doc (0·78, 0·66-0·93; p=0·006), and 76 months (39 to not reached) for SOC + ZA + Doc (0·82, 0·69-0·97; p=0·022). There was no evidence of heterogeneity in treatment effect (for any of the treatments) across prespecified subsets. Grade 3-5 adverse events were reported for 399 (32%) patients receiving SOC, 197 (32%) receiving SOC + ZA, 288 (52%) receiving SOC + Doc, and 269 (52%) receiving SOC + ZA + Doc. INTERPRETATION Zoledronic acid showed no evidence of survival improvement and should not be part of standard of care for this population. Docetaxel chemotherapy, given at the time of long-term hormone therapy initiation, showed evidence of improved survival accompanied by an increase in adverse events. Docetaxel treatment should become part of standard of care for adequately fit men commencing long-term hormone therapy. FUNDING Cancer Research UK, Medical Research Council, Novartis, Sanofi-Aventis, Pfizer, Janssen, Astellas, NIHR Clinical Research Network, Swiss Group for Clinical Cancer Research.
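The power statement ("90% power at 2.5% one-sided α for HR 0.75") is consistent with the standard Schoenfeld approximation for the number of events needed by a log-rank/Cox comparison. The sketch below is a back-of-the-envelope check under simplifying assumptions (2:1 control-to-research allocation, a crude event split by allocation and relative hazard), not the trial's actual design calculation.

```python
# Back-of-the-envelope check of the power statement above, using the standard
# Schoenfeld approximation for the number of events needed by a log-rank / Cox
# comparison. Simplifying assumptions throughout; not the trial's design code.
import math
from scipy.stats import norm

alpha_one_sided = 0.025
power = 0.90
hr = 0.75
p_control, p_research = 2 / 3, 1 / 3      # 2:1 allocation, control vs. one research arm

z = norm.ppf(1 - alpha_one_sided) + norm.ppf(power)
total_events = z**2 / (p_control * p_research * math.log(hr) ** 2)

# Crude split of events by allocation and relative hazard (control hazard = 1).
control_share = 2 / (2 + hr)
print(round(total_events), round(total_events * control_share))
# ~571 events in total, ~416 of them control-arm deaths: the same order as the
# "roughly 400 control arm deaths" quoted above.
```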
Abstract:
BACKGROUND Kidney recipients maintaining prolonged allograft survival in the absence of immunosuppressive drugs and without evidence of rejection are thought to be exceptional. The ERA-EDTA-DESCARTES working group, together with Nantes University, launched a Europe-wide survey to identify new patients, describe them, and estimate their frequency for the first time. METHODS Seventeen coordinators distributed a questionnaire in 256 transplant centres and 28 countries in order to report as many 'operationally tolerant' patients (TOL; defined as having a serum creatinine <1.7 mg/dL and proteinuria <1 g/day or g/g creatinine despite at least 1 year without any immunosuppressive drug) and 'almost tolerant' patients (minimally immunosuppressed (MIS) patients receiving low-dose steroids) as possible. We reported their number and the total number of kidney transplants performed at each centre to calculate their frequency. RESULTS One hundred and forty-seven questionnaires were returned, and we identified 66 TOL (61 with complete data) and 34 MIS patients. Of the 61 TOL patients, 26 had previously been described by the Nantes group and 35 new patients are presented here. Most of them were noncompliant patients. At data collection, 31/35 patients were alive and 22/31 were still TOL. Of the remaining 9/31, 2 had been restarted on immunosuppressive drugs and 7 had rising creatinine, of whom 3 resumed dialysis. Considering all patients, 10-year death-censored graft survival after immunosuppression weaning reached 85% in TOL patients and 100% in MIS patients. With 218 913 kidney recipients surveyed, the cumulative incidences of operational tolerance and almost tolerance were estimated at 3 and 1.5 per 10 000 kidney recipients, respectively. CONCLUSIONS In kidney transplantation, operational tolerance and almost tolerance are infrequent findings associated with excellent long-term death-censored graft survival.
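The quoted cumulative incidence estimates follow directly from the counts reported above:

```python
# Quick check of the cumulative incidence estimates: identified patients divided
# by the total number of kidney recipients surveyed, expressed per 10,000.
n_recipients = 218_913
tol_patients, mis_patients = 66, 34
print(round(tol_patients / n_recipients * 10_000, 2))   # ~3.0 per 10,000 (operational tolerance)
print(round(mis_patients / n_recipients * 10_000, 2))   # ~1.55 per 10,000, i.e. about 1.5 (almost tolerance)
```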