924 results for Equipment Failure Analysis
Abstract:
To systematically investigate putative causes of non-coronary high-sensitive troponin elevations in patients presenting to a tertiary care emergency department. In this cross-sectional analysis, patients who received serial measurements of high-sensitive troponin T between 1 August 2010 and 31 October 2012 at the Department of Emergency Medicine were included. The following putative causes were considered to be associated with non-acute coronary syndrome-related increases in high-sensitive troponin T: acute pulmonary embolism, renal insufficiency, aortic dissection, heart failure, peri-/myocarditis, strenuous exercise, rhabdomyolysis, cardiotoxic chemotherapy, high-frequency ablation therapy, defibrillator shocks, cardiac infiltrative disorders (e.g., amyloidosis), chest trauma, sepsis, shock, exacerbation of chronic obstructive pulmonary disease, and diabetic ketoacidosis. During the study period a total of 1,573 patients received serial measurements of high-sensitive troponin T. Of these, 175 patients were found to have acute coronary syndrome, leaving 1,398 patients for inclusion in the study. In 222 patients (30%), no putative cause described in the literature could be attributed to the observed elevation in high-sensitive troponin T. The most commonly encountered mechanism underlying the troponin T elevation was renal insufficiency, present in 286 patients (57%), followed by cerebral ischemia in 95 patients (19%), trauma in 75 patients (15%) and heart failure in 41 patients (8%). Non-acute coronary syndrome-associated elevation of high-sensitive troponin T levels is commonly observed in the emergency department. Renal insufficiency and acute cerebral events are the most common conditions associated with high-sensitive troponin T elevation.
Abstract:
Currently, there are no molecular biomarkers that guide treatment decisions for patients with head and neck squamous cell carcinoma (HNSCC). Several retrospective studies have evaluated TP53 in HNSCC, and results have suggested that specific mutations are associated with poor outcome. However, there exists heterogeneity among these studies in the site and stage of disease of the patients reviewed, the treatments rendered, and the methods of evaluating TP53 mutation. Thus, it remains unclear which patients and which clinical settings are those in which TP53 mutation is most useful in predicting treatment failure. In the current study, we reviewed the records of a cohort of patients with advanced, resectable HNSCC who received surgery and post-operative radiation (PORT) and had DNA isolated from fresh tumor tissue obtained at the time of surgery. TP53 mutations were identified using Sanger sequencing of exons 2-11 and the associated splice regions of the TP53 gene. We found that the group of patients with either non-disruptive or disruptive TP53 mutations had decreased overall survival, decreased disease-free survival, and an increased rate of distant metastasis. When examined as an independent factor, disruptive mutation was strongly associated with the development of distant metastasis. As a second aim of this project, we performed a pilot study examining the utility of the AmpliChip® p53 test as a practical method for TP53 sequencing in the clinical setting. AmpliChip® testing and Sanger sequencing were performed on a separate cohort of patients with HNSCC. Our study demonstrated the ability of the AmpliChip® to call TP53 mutation from a single formalin-fixed paraffin-embedded slide. The results from AmpliChip® testing were identical to those of the Sanger method in 11 of 19 cases, with a higher rate of mutation calls using the AmpliChip® test. TP53 mutation is a potential prognostic biomarker among patients with advanced, resectable HNSCC treated with surgery and PORT.
Whether this subgroup of patients could benefit from the addition of concurrent or induction chemotherapy remains to be evaluated in prospective clinical trials. Our pilot study of the p53 AmpliChip® suggests this could be a practical and reliable method of TP53 analysis in the clinical setting.
Abstract:
Treatment of mice with the immunomodulating agent Corynebacterium parvum (C. parvum) was shown to result in a severe and long-lasting depression of splenic natural killer (NK) cell-mediated cytotoxicity 5-21 days post-inoculation. Because NK cells have been implicated in immunosurveillance against malignancy (due to their spontaneous occurrence and rapid reactivity to a variety of histological types of tumors), as well as in resistance to established tumors, this decreased activity was of particular concern: the effect is contrary to what would be considered therapeutically desirable in cancer treatment (i.e., a potentiation of antitumor effector functions, including NK cell activity, would be expected to lead to more effective destruction of malignant cells). Therefore, an analysis of the mechanism of this decline of splenic NK cell activity in C. parvum-treated mice was undertaken. From in vitro co-culturing experiments, it was found that low NK-responsive C. parvum splenocytes were capable of reducing the normally high reactivity of cells from untreated syngeneic mice to YAC-1 lymphoma, suggesting the presence of NK-directed suppressor cells in C. parvum-treated animals. This was further supported by the demonstration of normal levels of cytotoxicity in C. parvum splenocyte preparations following Ficoll-Hypaque separation, which coincided with removal of the NK-suppressive capabilities of these cells. The T cell nature of these regulatory cells was indicated by (1) the failure of C. parvum to cause a reduction of NK cell activity, or the generation of NK-directed suppressor cells, in T cell-deficient athymic mice, (2) the removal of C. parvum-induced suppression by T cell-depleting fractionation procedures or treatments, and (3) the demonstration of suppression of NK cell activity by T cell-enriched C. parvum splenocytes.
These studies suggest, therefore, that the eventual reduction of suppression by T cell elimination and/or inhibition may result in a promotion of the antitumor effectiveness of C. parvum through the contribution of "freed" NK effector cell activity. However, the temporary suppression of NK cell activity induced by C. parvum (reactivity of treated mice returns to normal levels within 28 days after C. parvum injection) may in fact be favorable in some situations, e.g. in bone marrow transplantation cases, since NK cells have also been suggested to play a role in the process of bone marrow graft rejection. Therefore, the discriminate use of agents such as C. parvum may allow for the controlled regulation of NK cell activity suggested to be necessary for the optimization of therapeutic regimens.
Abstract:
BACKGROUND Type D (distressed) personality, the conjoint effect of negative affectivity (NA) and social inhibition (SI), predicts adverse cardiovascular outcomes, and is assessed with the 14-item Type D Scale (DS14). However, potential cross-cultural differences in Type D have not yet been examined in a direct comparison of countries. AIM To examine the cross-cultural validity of the Type D construct and its relation with cardiovascular risk factors, cardiac symptom severity, and depression/anxiety. METHODS In 22 countries, 6222 patients with ischemic heart disease (angina, 33%; myocardial infarction, 37%; or heart failure, 30%) completed the DS14 as part of the International HeartQoL Project. RESULTS Type D personality was assessed reliably across countries (αNA>.80; αSI>.74; except Russia, which was excluded from further analysis). Cross-cultural measurement equivalence was established for Type D personality at all measurement levels, as the factor-item configuration, factor loadings, and error structure did not differ across countries (fit: CFI=.91; NFI=.88; RMSEA=.018), or across gender and diagnostic subgroups. Type D personality was more prevalent in Southern (37%) and Eastern (35%) European countries than in Northern (24%) and Western European and English-speaking (both 27%) countries (p<.001). Type D was not confounded by cardiac symptom severity, but was associated with a higher prevalence of hypertension, smoking, sedentary lifestyle, and depression. CONCLUSION Cross-cultural measurement equivalence was demonstrated for the Type D scale in 21 countries. There is a pan-cultural relationship between Type D personality and some cardiovascular risk factors, supporting the role of Type D personality across countries and cardiac conditions.
Abstract:
OBJECTIVES It is still debated whether pre-existing minority drug-resistant HIV-1 variants (MVs) affect the virological outcomes of first-line NNRTI-containing ART. METHODS This Europe-wide case-control study included ART-naive subjects infected with drug-susceptible HIV-1 as revealed by population sequencing, who achieved virological suppression on first-line ART including one NNRTI. Cases experienced virological failure and controls were subjects from the same cohort whose viraemia remained suppressed at a matched time since initiation of ART. Blinded, centralized 454 pyrosequencing with parallel bioinformatic analysis in two laboratories was used to identify MVs in the 1%-25% frequency range. ORs of virological failure according to MV detection were estimated by logistic regression. RESULTS Two hundred and sixty samples (76 cases and 184 controls), mostly subtype B (73.5%), were used for the analysis. Identical MVs were detected in the two laboratories. Overall, 31.6% of cases and 16.8% of controls harboured pre-existing MVs. Detection of at least one MV versus no MVs was associated with an increased risk of virological failure (OR = 2.75, 95% CI = 1.35-5.60, P = 0.005); similar associations were observed for at least one MV versus no NRTI MVs (OR = 2.27, 95% CI = 0.76-6.77, P = 0.140) and at least one MV versus no NNRTI MVs (OR = 2.41, 95% CI = 1.12-5.18, P = 0.024). A dose-effect relationship between virological failure and mutational load was found. CONCLUSIONS Pre-existing MVs more than double the risk of virological failure on first-line NNRTI-based ART.
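The headline estimate above is an adjusted odds ratio from logistic regression; the crude (unadjusted) figure can be approximated from the reported detection frequencies alone. A minimal sketch, using counts back-calculated from the percentages (roughly 24/76 cases and 31/184 controls with minority variants, an approximation) and Woolf's method for the confidence interval; the paper's adjusted model accounts for matching and confounders, so its numbers differ:

```python
import math

def odds_ratio(a, b, c, d):
    """Crude odds ratio for a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    # Woolf's method: 95% CI computed on the log-odds scale
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Counts reconstructed from the reported percentages (31.6% of 76 cases,
# 16.8% of 184 controls harboured minority variants) -- approximate.
or_, lo, hi = odds_ratio(24, 52, 31, 153)
print(f"crude OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The crude ratio comes out near 2.3, in the same direction as the adjusted OR of 2.75 quoted in the abstract.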
Abstract:
Various assays have been used as an aid to diagnose failure of passive transfer (FPT) of immunoglobulins in neonatal foals, but they often lack sensitivity as screening tests, or are time consuming to perform and impractical as confirmatory tests. The aim of the present study was to evaluate whether measurement of serum total globulins (TG; i.e. total protein minus albumin) can be used to estimate the electrophoretic gamma globulin (EGG) fraction in hospitalised neonatal foals with suspected FPT. Sample data from 56 foals were evaluated retrospectively. The coefficient of rank correlation was 0.84. The area under the curve on ROC analysis was 0.887, 0.922 and 0.930 for EGG concentrations <2 g/L, <4 g/L and <8 g/L, respectively. Cut-offs for TG achieved ≥90% sensitivity for detecting EGG <2 g/L, <4 g/L and <8 g/L, with negative predictive values of >97% and >94% at prevalences of 15% and 30%, respectively. These results suggest that measurement of TG can be used as a guide to predicting EGG, provided that appropriate cut-off values are selected, and this technique could be a useful initial screening test for FPT in foals.
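Negative predictive value depends on prevalence, which is why the abstract quotes separate NPVs at 15% and 30%. A minimal sketch of that dependence; the 0.90 sensitivity is taken from the abstract, while the 0.70 specificity is a hypothetical value chosen for illustration (the abstract does not report it here):

```python
def npv(sensitivity, specificity, prevalence):
    """Negative predictive value from test characteristics and prevalence:
    NPV = TN / (TN + FN) on a per-population basis."""
    true_neg = specificity * (1 - prevalence)
    false_neg = (1 - sensitivity) * prevalence
    return true_neg / (true_neg + false_neg)

# Sensitivity 0.90 comes from the abstract; specificity 0.70 is an
# illustrative assumption, not a value reported by the study.
for prev in (0.15, 0.30):
    print(f"prevalence {prev:.0%}: NPV = {npv(0.90, 0.70, prev):.3f}")
```

Under these assumptions the NPV drops as prevalence rises, mirroring the >97% versus >94% figures quoted for the two prevalence scenarios.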
Abstract:
PURPOSE Clinical studies of the long-term outcomes of implant-supported reconstructions are still sparse. The aim of this 10-year retrospective study was to assess the rate of mechanical/technical complications and failures with implant-supported fixed dental prostheses (FDPs) and single crowns (SCs) in a large cohort of partially edentulous patients. MATERIALS AND METHODS The comprehensive multidisciplinary examination consisted of a medical/dental history, clinical examination, and a radiographic analysis. The prosthodontic examination evaluated the implant-supported reconstructions for mechanical/technical complications and failures, occlusal analysis, presence/absence of attrition, and location, extension, and retention type. RESULTS Of 397 fixed reconstructions in 303 patients, 268 were SCs and 127 were FDPs. Of these 397 implant-supported reconstructions, 18 had failed, yielding a failure rate of 4.5% and a survival rate of 95.5% after a mean observation period of 10.75 years (range: 8.4-13.5 years). The most frequent complication was ceramic chipping (20.31%), followed by occlusal screw loosening (2.57%) and loss of retention (2.06%). No occlusal screw fracture, one abutment loosening, and two abutment fractures were noted. This resulted in a total mechanical/technical complication rate of 24.7%. The prosthetic success rate over a mean follow-up time of 10.75 years was 70.8%. Generalized attrition and FDPs were associated with statistically significantly higher rates of ceramic fractures when compared with SCs. Cantilever extensions, screw retention, anterior versus posterior position, and gender did not influence the chipping rate. CONCLUSIONS After a mean exposure time of 10.75 years, high survival rates for reconstructions supported by Sand-blasted Large-grit Acid-etched implants can be expected.
Ceramic chipping was the most frequent complication and was increased in dentitions with attrition and in FDPs compared with SCs.
Abstract:
Therapeutic resistance remains the principal problem in acute myeloid leukemia (AML). We used area under receiver-operating characteristic curves (AUCs) to quantify our ability to predict therapeutic resistance in individual patients, where AUC=1.0 denotes perfect prediction and AUC=0.5 denotes a coin flip, using data from 4601 patients with newly diagnosed AML given induction therapy with 3+7 or more intense standard regimens in UK Medical Research Council/National Cancer Research Institute, Dutch–Belgian Cooperative Trial Group for Hematology/Oncology/Swiss Group for Clinical Cancer Research, US cooperative group SWOG and MD Anderson Cancer Center studies. Age, performance status, white blood cell count, secondary disease, cytogenetic risk and FLT3-ITD/NPM1 mutation status were each independently associated with failure to achieve complete remission despite no early death (‘primary refractoriness’). However, the AUC of a bootstrap-corrected multivariable model predicting this outcome was only 0.78, indicating only fair predictive ability. Removal of FLT3-ITD and NPM1 information only slightly decreased the AUC (0.76). Prediction of resistance, defined as primary refractoriness or short relapse-free survival, was even more difficult. Our limited ability to forecast resistance based on routinely available pretreatment covariates provides a rationale for continued randomization between standard and new therapies and supports further examination of genetic and posttreatment data to optimize resistance prediction in AML.
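The AUC anchors quoted above (1.0 denotes perfect prediction, 0.5 a coin flip) follow from the probabilistic reading of the AUC: the chance that a randomly chosen resistant patient is ranked above a randomly chosen non-resistant one. A minimal rank-based (Mann-Whitney) sketch on toy scores, not the study's actual model:

```python
def auc(scores_pos, scores_neg):
    """AUC as the Mann-Whitney probability that a randomly chosen
    positive (e.g. resistant) case scores higher than a randomly
    chosen negative case; ties count one half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# A perfectly separating score gives 1.0; an uninformative one gives 0.5.
print(auc([0.9, 0.8, 0.7], [0.3, 0.2, 0.1]))  # 1.0
print(auc([0.5, 0.5], [0.5, 0.5]))            # 0.5
```

On this scale, the study's bootstrap-corrected 0.78 sits between the coin flip and perfect prediction, which is why the authors call it only "fair".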
Abstract:
OBJECTIVES Fontan failure (FF) represents a growing and challenging indication for paediatric orthotopic heart transplantation (OHT). The aim of this study was to identify predictors of the best mid-term outcome in OHT after FF. METHODS Twenty-year multi-institutional retrospective analysis of OHT for FF. RESULTS Between 1991 and 2011, 61 patients, mean age 15.0 ± 9.7 years, underwent OHT for a failing atriopulmonary connection (17 patients, 27.8%) or total cavopulmonary connection (44 patients, 72.2%). Modes of FF included arrhythmia (14.8%), complex obstructions in the Fontan circuit (16.4%), protein-losing enteropathy (PLE) (22.9%), impaired ventricular function (31.1%) or a combination of the above (14.8%). The mean time interval between Fontan completion and OHT was 10.7 ± 6.6 years. Early FF occurred in 18%, requiring OHT 0.8 ± 0.5 years after Fontan. The hospital mortality rate was 18.3%, mainly secondary to infection (36.4%) and graft failure (27.3%). The mean follow-up was 66.8 ± 54.2 months. The overall Kaplan-Meier survival estimate was 81.9 ± 1.8% at 1 year, 73 ± 2.7% at 5 years and 56.8 ± 4.3% at 10 years. The Kaplan-Meier 5-year survival estimate was 82.3 ± 5.9% in late FF and 32.7 ± 15.0% in early FF (P = 0.0007). Late FF with poor ventricular function exhibited a 91.5 ± 5.8% 5-year OHT survival. PLE was cured in 77.7% of hospital survivors, but the 5-year Kaplan-Meier survival estimate in PLE was 46.3 ± 14.4% vs 84.3 ± 5.5% in non-PLE (P = 0.0147). Cox proportional hazards analysis identified early FF (P = 0.0005), complex Fontan pathway obstruction (P = 0.0043) and PLE (P = 0.0033) as independent predictors of 5-year mortality. CONCLUSIONS OHT is an excellent surgical option for late FF with impaired ventricular function. Protein-losing enteropathy improves with OHT, but PLE negatively affects the mid-term OHT outcome, mainly through early infective complications.
Abstract:
AIMS To assess incidence rates (IRs) of and identify risk factors for incident severe hypoglycaemia in patients with type 2 diabetes newly treated with antidiabetic drugs. METHODS Using the UK-based General Practice Research Database, we performed a retrospective cohort study between 1994 and 2011 and a nested case-control analysis. Ten controls from the population at risk were matched to each case with a recorded severe hypoglycaemia during follow-up on general practice, years of history in the database and calendar time. Using multivariate conditional logistic regression analyses, we adjusted for potential confounders. RESULTS Of 130,761 patients with newly treated type 2 diabetes (mean age 61.7 ± 13.0 years), 690 (0.5%) had an incident episode of severe hypoglycaemia recorded [estimated IR 11.97 (95% confidence interval, CI, 11.11-12.90) per 10,000 person-years (PYs)]. The IR was markedly higher in insulin users [49.64 (95% CI, 44.08-55.89) per 10,000 PYs] than in patients not using insulin [8.03 (95% CI, 7.30-8.84) per 10,000 PYs]. Based on results of the nested case-control analysis increasing age [≥ 75 vs. 20-59 years; adjusted odds ratio (OR), 2.27; 95% CI, 1.65-3.12], cognitive impairment/dementia (adjusted OR, 2.00; 95% CI, 1.37-2.91), renal failure (adjusted OR, 1.34; 95% CI, 1.04-1.71), current use of sulphonylureas (adjusted OR, 4.45; 95% CI, 3.53-5.60) and current insulin use (adjusted OR, 11.83; 95% CI, 9.00-15.54) were all associated with an increased risk of severe hypoglycaemia. CONCLUSIONS Severe hypoglycaemia was recorded in 12 cases per 10,000 PYs. Risk factors for severe hypoglycaemia included increasing age, renal failure, cognitive impairment/dementia, and current use of insulin or sulphonylureas.
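The reported rate and interval can be reproduced approximately from the event count alone. A sketch assuming roughly 576,400 person-years (back-calculated from the reported IR; the paper holds the exact denominator) and a log-scale Poisson approximation for the 95% CI:

```python
import math

def incidence_rate(events, person_years, per=10_000):
    """Incidence rate per `per` person-years with an approximate
    Poisson 95% CI (normal approximation on the log scale)."""
    ir = events / person_years * per
    factor = math.exp(1.96 / math.sqrt(events))
    return ir, ir / factor, ir * factor

# Person-years back-calculated from the reported rate (~576,400 PYs);
# an approximation, not the study's exact denominator.
ir, lo, hi = incidence_rate(690, 576_400)
print(f"IR = {ir:.2f} (95% CI {lo:.2f}-{hi:.2f}) per 10,000 PYs")
```

With 690 events this reproduces the abstract's 11.97 (11.11-12.90) per 10,000 PYs to within rounding.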
Abstract:
PURPOSE To assess clinical outcomes and patterns of loco-regional failure (LRF) in relation to clinical target volumes (CTV) in patients with locally advanced hypopharyngeal and laryngeal squamous cell carcinoma (HL-SCC) treated with definitive intensity modulated radiotherapy (IMRT) and concurrent systemic therapy. METHODS Data from HL-SCC patients treated from 2007 to 2010 were retrospectively evaluated. The primary endpoint was loco-regional control (LRC). Secondary endpoints included local (LC) and regional (RC) control, distant metastasis free survival (DMFS), laryngectomy free survival (LFS), overall survival (OS), and acute and late toxicities. Time-to-event endpoints were estimated using the Kaplan-Meier method, and univariate and multivariate analyses were performed using Cox proportional hazards models. Recurrent gross tumor volume (RTV) on post-treatment diagnostic imaging was analyzed in relation to the corresponding CTV (in-volume, >95% of RTV inside CTV; marginal, 20%-95% inside CTV; out-volume, <20% inside CTV). RESULTS Fifty patients (stage III: 14, IVa: 33, IVb: 3) completed treatment and were included in the analysis (median follow-up of 4.2 years). Three-year LRC, DMFS and OS were 77%, 96% and 63%, respectively. Grade 2 and 3 acute toxicity rates were 38% and 62%, respectively; grade 2 and 3 late toxicity rates were 23% and 15%, respectively. We identified 10 patients with LRF (8 local, 1 regional, 1 local + regional). Six of the 10 RTVs were fully included in both the elective and high-dose CTVs, and 4 RTVs were marginal to the high-dose CTVs. CONCLUSION The treatment of locally advanced HL-SCC with definitive IMRT and concurrent systemic therapy provides good LRC rates with an acceptable toxicity profile. Nevertheless, the analysis of LRFs in relation to CTVs showed in-volume relapses to be the major mode of recurrence, indicating that novel strategies to overcome radioresistance are required.
Abstract:
BACKGROUND Precise detection of volume change allows better estimation of the biological behavior of lung nodules. Postprocessing tools with automated detection, segmentation, and volumetric analysis of lung nodules may expedite radiological processes and give additional confidence to radiologists. PURPOSE To compare two different postprocessing software algorithms (LMS Lung, Median Technologies; LungCARE®, Siemens) in CT volumetric measurement and to analyze the effect of a soft (B30) and a hard reconstruction filter (B70) on automated volume measurement. MATERIAL AND METHODS Between January 2010 and April 2010, 45 patients with a total of 113 pulmonary nodules were included. The CT exam was performed on a 64-row multidetector CT scanner (Somatom Sensation, Siemens, Erlangen, Germany) with the following parameters: collimation, 24 x 1.2 mm; pitch, 1.15; voltage, 120 kVp; reference tube current-time product, 100 mAs. Automated volumetric measurement of each lung nodule was performed with the two postprocessing algorithms based on the two reconstruction filters (B30 and B70). The average relative volume measurement difference (VME%) and the limits of agreement between the two methods were used for comparison. RESULTS With soft reconstruction filters, the LMS system produced mean nodule volumes that were 34.1% (P < 0.0001) larger than those of the LungCARE® system. The VME% was 42.2%, with limits of agreement between -53.9% and 138.4%. The volume measurement with soft filters (B30) was significantly larger than with hard filters (B70): 11.2% for LMS and 1.6% for LungCARE®, respectively (both P < 0.05). LMS measured greater volumes with both filters, 13.6% for soft and 3.8% for hard filters, respectively (P < 0.01 and P > 0.05).
CONCLUSION There is a substantial inter-software (LMS/LungCARE®) as well as intra-software variability (B30/B70) in lung nodule volume measurement; therefore, it is mandatory to use the same equipment with the same reconstruction filter for the follow-up of lung nodule volume.
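The limits of agreement quoted above are the Bland-Altman interval: the mean paired difference plus or minus 1.96 standard deviations. A minimal sketch on hypothetical per-nodule relative differences (illustrative values, not the study's data):

```python
import math

def limits_of_agreement(diffs_percent):
    """Bland-Altman 95% limits of agreement for paired relative
    differences (e.g. VME%): mean difference +/- 1.96 * SD."""
    n = len(diffs_percent)
    mean = sum(diffs_percent) / n
    # Sample standard deviation (n - 1 in the denominator)
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs_percent) / (n - 1))
    return mean, mean - 1.96 * sd, mean + 1.96 * sd

# Hypothetical per-nodule relative volume differences between the two
# software packages, in percent (not the study's measurements).
diffs = [30.0, 55.0, 12.0, 80.0, 41.0, -5.0, 60.0, 65.0]
mean, lo, hi = limits_of_agreement(diffs)
print(f"mean VME% = {mean:.1f}, limits of agreement {lo:.1f} to {hi:.1f}")
```

A wide interval, as in the study's -53.9% to 138.4%, signals that individual nodule volumes from the two packages are not interchangeable even when the mean bias looks moderate.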
Abstract:
INTRODUCTION AND OBJECTIVES There is continued debate about the routine use of aspiration thrombectomy in patients with ST-segment elevation myocardial infarction. Our aim was to evaluate clinical and procedural outcomes of aspiration thrombectomy-assisted primary percutaneous coronary intervention compared with conventional primary percutaneous coronary intervention in patients with ST-segment elevation myocardial infarction. METHODS We performed a meta-analysis of 26 randomized controlled trials with a total of 11 943 patients. Clinical outcomes were extracted up to maximum follow-up and random effect models were used to assess differences in outcomes. RESULTS We observed no difference in the risk of all-cause death (pooled risk ratio = 0.88; 95% confidence interval, 0.74-1.04; P = .124), reinfarction (pooled risk ratio = 0.85; 95% confidence interval, 0.67-1.08; P = .176), target vessel revascularization (pooled risk ratio = 0.86; 95% confidence interval, 0.73-1.00; P = .052), or definite stent thrombosis (pooled risk ratio = 0.76; 95% confidence interval, 0.49-1.16; P = .202) between the 2 groups at a mean weighted follow-up time of 10.4 months. There were significant reductions in failure to reach Thrombolysis In Myocardial Infarction 3 flow (pooled risk ratio = 0.70; 95% confidence interval, 0.60-0.81; P < .001) or myocardial blush grade 3 (pooled risk ratio = 0.76; 95% confidence interval, 0.65-0.89; P = .001), incomplete ST-segment resolution (pooled risk ratio = 0.72; 95% confidence interval, 0.62-0.84; P < .001), and evidence of distal embolization (pooled risk ratio = 0.61; 95% confidence interval, 0.46-0.81; P = .001) with aspiration thrombectomy but estimates were heterogeneous between trials. 
CONCLUSIONS Among unselected patients with ST-segment elevation myocardial infarction, aspiration thrombectomy-assisted primary percutaneous coronary intervention does not improve clinical outcomes, despite improved epicardial and myocardial parameters of reperfusion. Full English text available from: www.revespcardiol.org/en.
Abstract:
BACKGROUND Ultrathin strut biodegradable polymer sirolimus-eluting stents (BP-SES) proved noninferior to durable polymer everolimus-eluting stents (DP-EES) for a composite clinical end point in a population with minimal exclusion criteria. We performed a prespecified subgroup analysis of the Ultrathin Strut Biodegradable Polymer Sirolimus-Eluting Stent Versus Durable Polymer Everolimus-Eluting Stent for Percutaneous Coronary Revascularisation (BIOSCIENCE) trial to compare the performance of BP-SES and DP-EES in patients with diabetes mellitus. METHODS AND RESULTS BIOSCIENCE trial was an investigator-initiated, single-blind, multicentre, randomized, noninferiority trial comparing BP-SES versus DP-EES. The primary end point, target lesion failure, was a composite of cardiac death, target-vessel myocardial infarction, and clinically indicated target lesion revascularization within 12 months. Among a total of 2119 patients enrolled between February 2012 and May 2013, 486 (22.9%) had diabetes mellitus. Overall diabetic patients experienced a significantly higher risk of target lesion failure compared with patients without diabetes mellitus (10.1% versus 5.7%; hazard ratio [HR], 1.80; 95% confidence interval [CI], 1.27-2.56; P=0.001). At 1 year, there were no differences between BP-SES versus DP-EES in terms of the primary end point in both diabetic (10.9% versus 9.3%; HR, 1.19; 95% CI, 0.67-2.10; P=0.56) and nondiabetic patients (5.3% versus 6.0%; HR, 0.88; 95% CI, 0.58-1.33; P=0.55). Similarly, no significant differences in the risk of definite or probable stent thrombosis were recorded according to treatment arm in both study groups (4.0% versus 3.1%; HR, 1.30; 95% CI, 0.49-3.41; P=0.60 for diabetic patients and 2.4% versus 3.4%; HR, 0.70; 95% CI, 0.39-1.25; P=0.23, in nondiabetics). CONCLUSIONS In the prespecified subgroup analysis of the BIOSCIENCE trial, clinical outcomes among diabetic patients treated with BP-SES or DP-EES were comparable at 1 year. 
CLINICAL TRIAL REGISTRATION URL: http://www.clinicaltrials.gov. Unique identifier: NCT01443104.
Abstract:
INTRODUCTION Patients admitted to intensive care following surgery for faecal peritonitis present particular challenges in terms of clinical management and risk assessment. Collaborating surgical and intensive care teams need shared perspectives on prognosis. We aimed to determine the relationship between dynamic assessment of trends in selected variables and outcomes. METHODS We analysed trends in physiological and laboratory variables during the first week of intensive care unit (ICU) stay in 977 patients at 102 centres across 16 European countries. The primary outcome was 6-month mortality. Secondary endpoints were ICU, hospital and 28-day mortality. For each trend, Cox proportional hazards (PH) regression analyses, adjusted for age and sex, were performed for each endpoint. RESULTS Trends over the first 7 days of the ICU stay independently associated with 6-month mortality were worsening thrombocytopaenia (mortality: hazard ratio (HR) = 1.02; 95% confidence interval (CI), 1.01 to 1.03; P < 0.001) and renal function (total daily urine output: HR = 1.02; 95% CI, 1.01 to 1.03; P < 0.001; Sequential Organ Failure Assessment (SOFA) renal subscore: HR = 0.87; 95% CI, 0.75 to 0.99; P = 0.047), maximum bilirubin level (HR = 0.99; 95% CI, 0.99 to 0.99; P = 0.02) and the Glasgow Coma Scale (GCS) SOFA subscore (HR = 0.81; 95% CI, 0.68 to 0.98; P = 0.028). Changes in renal function (total daily urine output and the renal component of the SOFA score), the GCS component of the SOFA score, the total SOFA score and worsening thrombocytopaenia were also independently associated with the secondary outcomes (ICU, hospital and 28-day mortality). We detected the same pattern when we analysed trends on days 2, 3 and 5.
Dynamic trends in all other measured laboratory and physiological variables, and in radiological findings, changes in respiratory support, renal replacement therapy and inotrope and/or vasopressor requirements, were not retained as independently associated with outcome in multivariate analysis. CONCLUSIONS Only deterioration in renal function, thrombocytopaenia and SOFA score over the first 2, 3, 5 and 7 days of the ICU stay were consistently associated with mortality at all endpoints. These findings may help to inform clinical decision making in patients with this common cause of critical illness.