196 results for Uppföljning. Keywords: Critical care


Relevance:

100.00%

Abstract:

OBJECTIVE We report a case of a woman with hyperammonemic encephalopathy following glutamine supplementation. DESIGN Case report. INTERVENTIONS Plasma amino acid analysis was suggestive of a urea cycle defect, and treatment was initiated with lactulose and the two ammonia scavenger drugs sodium benzoate and phenylacetate. Together with a restricted protein intake, ammonia and glutamine plasma levels decreased, with subsequent improvement of the neurological status. MEASUREMENTS AND MAIN RESULTS Massive catabolism and exogenous glutamine administration may have contributed to the hyperammonemia and hyperglutaminemia in this patient. CONCLUSION This case adds further concern regarding glutamine administration to critically ill patients and underscores the importance of monitoring ammonia and glutamine serum levels in such patients.

Relevance:

100.00%

Abstract:

The ATLS program of the American College of Surgeons is probably the most important globally active training organization dedicated to improving trauma management. Detection of acute haemorrhagic shock is one of the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are tied to ranges of uncompensated blood loss, and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on the patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012).

                             Class I            Class II          Class III              Class IV
Blood loss (ml)              Up to 750          750–1500          1500–2000              >2000
Blood loss (% blood volume)  Up to 15%          15–30%            30–40%                 >40%
Pulse rate (bpm)             <100               100–120           120–140                >140
Systolic blood pressure      Normal             Normal            Decreased              Decreased
Pulse pressure               Normal or ↑        Decreased         Decreased              Decreased
Respiratory rate             14–20              20–30             30–40                  >35
Urine output (ml/h)          >30                20–30             5–15                   Negligible
CNS/mental status            Slightly anxious   Mildly anxious    Anxious, confused      Confused, lethargic
Initial fluid replacement    Crystalloid        Crystalloid       Crystalloid and blood  Crystalloid and blood

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated from the injuries of nearly 165,000 adult trauma patients, and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the 140 min⁻¹ postulated by ATLS. Moreover, deterioration of the different parameters does not necessarily proceed in parallel, as the ATLS shock classification suggests [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient. A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II, and only 1% each as Class III and Class IV. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum or pelvic cavity on CT scan, or requirement of any blood transfusion or of >2000 ml of crystalloid. Because of the low prevalence of class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined.
As in the retrospective studies, Lawton did not find a statistically significant difference in heart rate or blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of "clinical estimation of blood loss" and the requirement of fluid substitution. This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation. Nevertheless, the shock classification was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].

                         Vitals                           Response to fluid bolus (1000 ml)    Estimated blood loss (ml)
Normal (no haemorrhage)  Normal                           NA                                   None
Class I (mild)           Normal                           Yes, no further fluid required       Up to 750
Class II (moderate)      HR >100 with SBP >90 mmHg        Yes, no further fluid required       750–1500
Class III (severe)       SBP <90 mmHg                     Requires repeated fluid boluses      1500–2000
Class IV (moribund)      SBP <90 mmHg or imminent arrest  Declining SBP despite fluid boluses  >2000

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted physiologic concept of the organism's response to fluid loss: cardiac output decreases, heart rate increases and pulse pressure decreases first; hypotension and bradycardia occur only later. Increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit and the anesthetized patient in the OR. We will therefore continue to teach this typical pattern, but will continue to mention the exceptions and pitfalls at a second stage. The ATLS shock classification is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension) as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension). Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS also refers to the pitfalls in the signs of acute haemorrhage: advanced age, athletes, pregnancy, medications and pacemakers; and it explicitly states that individual subjects may not follow the general pattern. Obviously, the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and which has been used for decades, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss, together with the clinical exam and laboratory findings (e.g. base deficit and lactate), but does not use a shock classification tied to absolute values. In conclusion, the typical physiologic response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching.
The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs; it also includes the pattern of injuries, the requirement of fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but they must be interpreted in view of the clinical context. Conflict of interest: None declared. The author is a member of the Swiss national ATLS core faculty.
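The crux of the validation problem discussed above is that the Table 1 bands behave like a lookup whose rows can disagree. The following minimal Python sketch (a hypothetical illustration, not part of any ATLS material or clinical tool) encodes three of the bands and shows how a plausible patient can fit no class at all:

```python
# Minimal sketch (not from ATLS materials) of three Table 1 parameter bands.
# Checking which classes a patient's findings are compatible with illustrates
# why parameters that do not deteriorate in parallel defeat single-class
# allocation. Illustration only, never a clinical decision aid.

BANDS = {
    # class: (pulse bpm range, SBP decreased?, respiratory rate range)
    "I":   ((0, 100),   False, (14, 20)),
    "II":  ((100, 120), False, (20, 30)),
    "III": ((120, 140), True,  (30, 40)),
    "IV":  ((140, 999), True,  (35, 999)),
}

def compatible_classes(pulse: float, sbp_decreased: bool, resp: float) -> list[str]:
    """Return every ATLS class whose bands admit all three findings."""
    hits = []
    for cls, ((p_lo, p_hi), dec, (r_lo, r_hi)) in BANDS.items():
        if p_lo <= pulse < p_hi and dec == sbp_decreased and r_lo <= resp < r_hi:
            hits.append(cls)
    return hits

# A tachycardic but normotensive, normally breathing patient fits no class:
print(compatible_classes(pulse=125, sbp_decreased=False, resp=16))  # -> []
```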

Relevance:

100.00%

Abstract:

RATIONALE The use of 6-minute-walk distance (6MWD) as an indicator of exercise capacity to predict postoperative survival in lung transplantation has not previously been well studied. OBJECTIVES To evaluate the association between 6MWD and postoperative survival following lung transplantation. METHODS Adult, first-time, lung-only transplantations recorded in the United Network for Organ Sharing database from May 2005 to December 2011 were analyzed. Kaplan-Meier methods and Cox proportional hazards modeling were used to determine the association between preoperative 6MWD and post-transplant survival after adjusting for potential confounders. A receiver operating characteristic (ROC) curve was used to determine the 6MWD value that provided maximal separation in 1-year mortality. A subanalysis was performed to assess the association between 6MWD and post-transplant survival by disease category. MEASUREMENTS AND MAIN RESULTS A total of 9,526 patients were included for analysis. The median 6MWD was 787 ft (25th–75th percentiles, 450–1,082 ft). Increasing 6MWD was associated with a significantly lower overall hazard of death (P < 0.001). A continuous increase in walk distance through 1,200–1,400 ft conferred an incremental survival advantage. Although 6MWD strongly correlated with survival, the ability of a single dichotomous value to predict outcomes was limited. All disease categories demonstrated significantly longer survival with increasing 6MWD (P ≤ 0.009) except pulmonary vascular disease (P = 0.74); however, the low volume in this category (n = 312; 3.3%) may limit the ability to detect an association. CONCLUSIONS 6MWD is significantly associated with post-transplant survival and is best incorporated into transplant evaluations on a continuous basis, given the limited ability of a single dichotomous value to predict outcomes.
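The two analyses described here, a Cox model on continuous 6MWD and an ROC-derived dichotomous cutoff for 1-year mortality, can be sketched as follows. The data file, column names, and the lifelines/scikit-learn toolchain are assumptions for illustration, not the authors' actual code:

```python
# Hypothetical sketch of the abstract's two analyses:
# (1) Cox proportional hazards on continuous 6MWD;
# (2) ROC curve on 1-year mortality to find the cutoff with maximal
#     separation (Youden's J). File and column names are invented.
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.metrics import roc_curve

df = pd.read_csv("unos_cohort.csv")  # hypothetical extract of the cohort

# (1) Continuous association: hazard of death per foot of walk distance.
# Further confounders would be added as numerically coded columns.
cph = CoxPHFitter()
cph.fit(df[["survival_days", "died", "six_mwd_ft", "age"]],
        duration_col="survival_days", event_col="died")
cph.print_summary()

# (2) Dichotomous cutoff: ROC point maximizing TPR - FPR. The score is
# negated because a SHORTER walk distance predicts death.
fpr, tpr, thresholds = roc_curve(df["dead_at_1yr"], -df["six_mwd_ft"])
best = (tpr - fpr).argmax()
print("6MWD cutoff (ft):", -thresholds[best])
```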

Relevance:

100.00%

Abstract:

RATIONALE Changes in the pulmonary microbiota are associated with progressive respiratory diseases, including chronic obstructive pulmonary disease. Whether there is a causal relationship between these changes and disease progression remains unknown. OBJECTIVE To investigate the link between an altered microbiota and disease, we utilized a model of chronic lung inflammation in specific pathogen-free (SPF) mice and in mice depleted of microbiota by antibiotic treatment or devoid of a microbiota (axenic). METHODS Mice were challenged with LPS/elastase intranasally over 4 weeks, resulting in a chronically inflamed and damaged lung. The ensuing cellular infiltration, histological damage and decline in lung function were quantified. MEASUREMENTS AND MAIN RESULTS As in human disease, the composition of the pulmonary microbiota was altered in diseased animals. Microbiota richness and diversity were decreased in LPS/elastase-treated mice, with an increased representation of the genera Pseudomonas and Lactobacillus and a reduction in Prevotella. Moreover, the microbiota was implicated in disease development: mice depleted of microbiota exhibited improved lung function, reduced airway inflammation, and decreased lymphoid neogenesis and auto-reactive antibody responses. The absence of microbial cues also markedly decreased the production of IL-17A, whilst intranasal transfer of fluid enriched with the pulmonary microbiota isolated from diseased mice enhanced IL-17A production in the lungs of antibiotic-treated or axenic recipients. Finally, in mice harboring a microbiota, neutralizing IL-17A dampened inflammation and restored lung function. CONCLUSIONS Collectively, our data indicate that host-microbial cross-talk promotes inflammation and could underlie the chronicity of inflammatory lung diseases.
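Richness and diversity here refer to standard community-ecology indices computed from per-genus counts: richness is the number of genera observed, and Shannon diversity is H = -Σ pᵢ ln pᵢ. A minimal sketch with invented count vectors (not the study's data):

```python
# Minimal sketch of the two community metrics named in the abstract:
# richness (number of genera observed) and Shannon diversity
# H = -sum(p_i * ln p_i). Count vectors are invented for illustration.
import numpy as np

def richness(counts):
    counts = np.asarray(counts, dtype=float)
    return int((counts > 0).sum())

def shannon(counts):
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

control      = [120, 80, 60, 40, 30, 20, 10, 5]  # evenly spread community
lps_elastase = [300, 150, 5, 3, 2, 0, 0, 0]      # fewer genera, more dominance

for name, c in [("control", control), ("LPS/elastase", lps_elastase)]:
    print(f"{name}: richness={richness(c)}, Shannon H={shannon(c):.2f}")
```

Both numbers drop for the second community, which is the qualitative pattern the abstract reports for treated mice.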

Relevance:

100.00%

Abstract:

BACKGROUND Evidence suggests that EMS-physician-guided cardiopulmonary resuscitation (CPR) in out-of-hospital cardiac arrest (OOHCA) may be associated with improved outcomes, yet randomized controlled trials are not available. The goal of this meta-analysis was to determine the association between EMS-physician- versus paramedic-guided CPR and survival after OOHCA. METHODS AND RESULTS Studies comparing EMS-physician- versus paramedic-guided CPR in OOHCA published until June 2014 were systematically searched in the MEDLINE, EMBASE and Cochrane databases. All studies were required to contain survival data. Data on study characteristics, methods, and survival outcomes were extracted. A random-effects model was used for the meta-analysis due to a high degree of heterogeneity among the studies (I² = 44%). Return of spontaneous circulation (ROSC), survival to hospital admission, and survival to hospital discharge were the outcome measures. Out of 3,385 potentially eligible studies, 14 met the inclusion criteria. In the pooled analysis (n = 126,829), EMS-physician-guided CPR was associated with improved outcomes compared to paramedic-guided CPR: ROSC 36.2% (95% confidence interval [CI] 31.0–41.7%) vs. 23.4% (95% CI 18.5–29.2%) (pooled odds ratio [OR] 1.89, 95% CI 1.36–2.63, p < 0.001); survival to hospital admission 30.1% (95% CI 24.2–36.7%) vs. 19.2% (95% CI 12.7–28.1%) (pooled OR 1.78, 95% CI 0.97–3.28, p = 0.06); and survival to hospital discharge 15.1% (95% CI 14.6–15.7%) vs. 8.4% (95% CI 8.2–8.5%) (pooled OR 2.03, 95% CI 1.48–2.79, p < 0.001). CONCLUSIONS This systematic review suggests that EMS-physician-guided CPR in out-of-hospital cardiac arrest is associated with improved survival outcomes.
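A random-effects pooled odds ratio of the kind reported here is commonly obtained with the DerSimonian-Laird estimator; whether the authors used exactly this estimator is an assumption. A self-contained sketch with invented 2x2 tables (not the 14 included studies):

```python
# Minimal DerSimonian-Laird random-effects pooling of odds ratios.
# The 2x2 tables below are invented for illustration.
import numpy as np

# (events_physician, n_physician, events_paramedic, n_paramedic) per study
studies = [(45, 120, 30, 130), (60, 200, 41, 210), (25, 90, 18, 95)]

log_or, var = [], []
for a, n1, c, n2 in studies:
    b, d = n1 - a, n2 - c
    log_or.append(np.log((a * d) / (b * c)))
    var.append(1/a + 1/b + 1/c + 1/d)        # variance of log OR (Woolf)
log_or, var = np.array(log_or), np.array(var)

w = 1 / var                                   # fixed-effect weights
q = float((w * (log_or - (w * log_or).sum() / w.sum())**2).sum())
k = len(studies)
tau2 = max(0.0, (q - (k - 1)) / (w.sum() - (w**2).sum() / w.sum()))
i2 = max(0.0, (q - (k - 1)) / q) * 100        # heterogeneity, as in "I² = 44%"

w_re = 1 / (var + tau2)                       # random-effects weights
pooled = (w_re * log_or).sum() / w_re.sum()
se = np.sqrt(1 / w_re.sum())
print(f"I^2 = {i2:.0f}%, pooled OR = {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96*se):.2f}-{np.exp(pooled + 1.96*se):.2f})")
```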

Relevance:

100.00%

Abstract:

Hypotension during intermittent hemodialysis is common, and has been attributed to acute volume shifts, shifts in osmolarity, electrolyte imbalance, temperature changes, altered vasoregulation, and sheer hypovolemia. Although hypovolemia may intuitively seem a likely cause for hypotension in intensive care patients, its role in the pathogenesis of intradialytic hypotension may be overestimated.

Relevance:

100.00%

Abstract:

OBJECTIVE To determine changes in creatinine concentrations following the administration of 6% tetrastarch (hydroxyethyl starch [HES] 130/0.4) compared to crystalloids (CRYSs) in critically ill dogs. DESIGN Retrospective case series (2010–2013). SETTING University teaching hospital. ANIMALS Two hundred and one dogs admitted to the intensive care unit with initial plasma creatinine concentrations not exceeding the laboratory reference interval (52–117 μmol/L [0.6–1.3 mg/dL]) and receiving either CRYSs alone (CRYS group, n = 115) or HES with or without CRYSs (HES group, n = 86) for at least 24 hours. INTERVENTIONS None. MEASUREMENTS AND MAIN RESULTS Creatinine concentrations at admission to the intensive care unit (T0), and at 2–13 days (T1) and 2–12 weeks (T2) after initiation of fluid therapy, were analyzed, both as absolute values and as the maximum percentage change from T0 to T1 (T1max%) and from T0 to T2 (T2max%). Creatinine concentrations were available for 192 dogs during T1 and 37 dogs during T2. The median cumulative dose of HES was 86 mL/kg (range, 12–336 mL/kg). No difference was detected between the groups for age, gender, body weight, or length of hospitalization. Outcome differed significantly between the HES (66% survived) and CRYS (87% survived) groups (P = 0.014). No significant difference was detected between the groups for creatinine concentrations at T0, T1, T2, T1max%, or T2max%, nor for T1max% creatinine in dogs subclassified as having systemic inflammatory response syndrome or sepsis. CONCLUSIONS HES administration in this canine population did not result in increased creatinine concentrations compared to administration of CRYSs. Further studies are needed to establish the safety of HES in critically ill dogs.
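The derived outcomes T1max% and T2max% are simply the maximum percentage change from the admission value within each follow-up window. A small pandas sketch with an invented measurement table (column names hypothetical):

```python
# Sketch of the abstract's derived outcome: maximum percentage change in
# creatinine from admission (T0) within a follow-up window (T1max%).
# The measurement table is invented for illustration.
import pandas as pd

df = pd.DataFrame({
    "dog":        ["A", "A", "A", "B", "B"],
    "window":     ["T0", "T1", "T1", "T0", "T1"],
    "creat_umol": [90.0, 110.0, 130.0, 100.0, 95.0],
})

t0 = df[df.window == "T0"].set_index("dog")["creat_umol"]
t1 = df[df.window == "T1"].groupby("dog")["creat_umol"]

# Maximum % change from baseline within T1, per dog (monotone in the raw
# value, so the max measurement gives the max % change).
t1max_pct = (t1.max() - t0) / t0 * 100
print(t1max_pct)  # dog A: +44.4%, dog B: -5.0%
```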