Abstract:
The aim of the present study was to investigate the effects of continuous and acute L-carnitine supplementation of total parenteral nutrition (TPN) on protein and fat oxidation in severe catabolism. A critically ill and severely malnourished male patient received TPN (non-protein energy = 41 kcal/kg/day, provided equally as fat and glucose) over 38 days, without L-carnitine for 23 days and with carnitine supplements (15 mg/kg/day) for the following 15 days. Subsequently, he was given carnitine-free enteral nutrition for 60 more days. A four-hour infusion of 100 mg L-carnitine was given on day 11 of each TPN period. Indirect calorimetry was carried out after 11 days of either carnitine-free or supplemented TPN and at the initiation of enteral nutrition. Additional measurements were performed 4 hours and 24 hours after the acute infusions of carnitine. The rate of protein oxidation and the respiratory quotient were found to be higher, and the rate of fat oxidation lower, with carnitine-supplemented TPN than with either carnitine-free TPN or enteral nutrition. Acute infusion of carnitine resulted in an increased rate of protein oxidation and a reduced rate of fat oxidation under both TPN regimens. These unfavourable effects on protein metabolism may be due to an impairment of fat oxidation by excess amounts of carnitine.
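The respiratory quotient used above as a marker of substrate oxidation is simply the ratio of CO2 production to O2 consumption measured by indirect calorimetry. A minimal illustrative sketch (the function name and the example values are my own, not from the study):

```python
def respiratory_quotient(vco2_ml_min: float, vo2_ml_min: float) -> float:
    """Respiratory quotient (RQ) from indirect calorimetry: CO2
    produced over O2 consumed. An RQ near 0.7 indicates predominantly
    fat oxidation; an RQ near 1.0, predominantly carbohydrate oxidation."""
    return vco2_ml_min / vo2_ml_min

# e.g. VCO2 = 200 ml/min and VO2 = 250 ml/min give an RQ of 0.8
```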
Abstract:
Background: The aim of this study was to compare percutaneous drainage (PD) of the gallbladder to emergency cholecystectomy (EC) in a well-defined group of patients with sepsis related to acute calculous/acalculous cholecystitis (ACC/AAC). Methods: Between 2001 and 2007, all consecutive patients in our ICU treated by either PD or EC were retrospectively analyzed. Cases were collected from a prospective database. Percutaneous drainage was performed by a transhepatic route and EC by an open or laparoscopic approach. Patients' general condition and organ dysfunction were assessed by two validated scoring systems (SAPS II and SOFA, respectively). Morbidity, mortality, and long-term outcome were systematically reviewed and analyzed in both groups. Results: Forty-two patients [median age = 65.5 years (range = 32-94)] were included; 45% underwent EC (ten laparoscopic, nine open) and 55% PD (n = 23). Both patient groups had similar preoperative characteristics. Percutaneous drainage and EC were successful in 91% and 100% of patients, respectively. Organ dysfunctions were similarly improved by the third postoperative/post-drainage day. Despite undergoing PD, two patients required EC due to gangrenous cholecystitis. The conversion rate after laparoscopy was 20%. Overall morbidity was 8.7% after PD and 47% after EC (P = 0.011). Major morbidity was 0% after PD and 21% after EC (P = 0.034). The mortality rate was not different (13% after PD and 16% after EC, P = 1.0), and the deaths were all related to the patients' preexisting disease. Hospital and ICU stays were not different. Recurrent symptoms (17%) occurred only after ACC in the PD group. Conclusions: In high-risk patients, PD and EC are both efficient in resolving acute cholecystitis sepsis. However, EC is associated with higher procedure-related morbidity, and the laparoscopic approach is not always possible. Percutaneous drainage represents a valuable intervention, but secondary cholecystectomy is mandatory in cases of acute calculous cholecystitis.
Abstract:
BACKGROUND: Enteral nutrition (EN) is recommended for patients in the intensive-care unit (ICU), but it does not consistently achieve nutritional goals. We assessed whether delivery of 100% of the energy target from days 4 to 8 in the ICU with EN plus supplemental parenteral nutrition (SPN) could optimise clinical outcome. METHODS: This randomised controlled trial was undertaken in two centres in Switzerland. We enrolled patients on day 3 of admission to the ICU who had received less than 60% of their energy target from EN, were expected to stay for longer than 5 days, and were expected to survive for longer than 7 days. We calculated energy targets with indirect calorimetry on day 3, or if not possible, set targets as 25 and 30 kcal per kg of ideal bodyweight a day for women and men, respectively. Patients were randomly assigned (1:1) by a computer-generated randomisation sequence to receive EN or SPN. The primary outcome was occurrence of nosocomial infection after cessation of intervention (day 8), measured until end of follow-up (day 28), analysed by intention to treat. This trial is registered with ClinicalTrials.gov, number NCT00802503. FINDINGS: We randomly assigned 153 patients to SPN and 152 to EN. 30 patients discontinued before the study end. Mean energy delivery between days 4 and 8 was 28 kcal/kg per day (SD 5) for the SPN group (103% [SD 18%] of energy target), compared with 20 kcal/kg per day (7) for the EN group (77% [27%]). Between days 9 and 28, 41 (27%) of 153 patients in the SPN group had a nosocomial infection compared with 58 (38%) of 152 patients in the EN group (hazard ratio 0.65, 95% CI 0.43-0.97; p=0.0338), and the SPN group had a lower mean number of nosocomial infections per patient (-0.42 [-0.79 to -0.05]; p=0.0248).
INTERPRETATION: Individually optimised energy supplementation with SPN starting 4 days after ICU admission could reduce nosocomial infections and should be considered as a strategy to improve clinical outcome in patients in the ICU for whom EN is insufficient. FUNDING: Foundation Nutrition 2000Plus, ICU Quality Funds, Baxter, and Fresenius Kabi.
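The fallback energy-target rule described above (25 kcal/kg/day of ideal bodyweight for women, 30 kcal/kg/day for men, when indirect calorimetry is not possible) can be sketched as follows; this is an illustrative snippet, and the function name and sex encoding are my own, not from the trial:

```python
def energy_target_kcal_per_day(ideal_body_weight_kg: float, sex: str) -> float:
    """Fallback daily energy target when indirect calorimetry is
    unavailable: 25 kcal/kg (women) or 30 kcal/kg (men) of ideal
    bodyweight, as in the trial's protocol."""
    kcal_per_kg = 25 if sex == "female" else 30
    return kcal_per_kg * ideal_body_weight_kg

# e.g. a man with 70 kg ideal bodyweight: 30 * 70 = 2100 kcal/day
```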
Abstract:
The antibiotic pipeline continues to diminish and the majority of the public remains unaware of this critical situation. The cause of the decline of antibiotic development is multifactorial and currently most ICUs are confronted with the challenge of multidrug-resistant organisms. Antimicrobial multidrug resistance is expanding all over the world, with extreme and pandrug resistance being increasingly encountered, especially in healthcare-associated infections in large highly specialized hospitals. Antibiotic stewardship for critically ill patients translated into the implementation of specific guidelines, largely promoted by the Surviving Sepsis Campaign, targeted at education to optimize choice, dosage, and duration of antibiotics in order to improve outcomes and reduce the development of resistance. Inappropriate antimicrobial therapy, meaning the selection of an antibiotic to which the causative pathogen is resistant, is a consistent predictor of poor outcomes in septic patients. Therefore, pharmacokinetically/pharmacodynamically optimized dosing regimens should be given to all patients empirically and, once the pathogen and susceptibility are known, local stewardship practices may be employed on the basis of clinical response to redefine an appropriate regimen for the patient. This review will focus on the most severely ill patients, for whom substantial progress in organ support along with diagnostic and therapeutic strategies markedly increased the risk of nosocomial infections.
Abstract:
PURPOSE: Adequate empirical antibiotic dose selection for critically ill burn patients is difficult due to extreme variability in drug pharmacokinetics. Therapeutic drug monitoring (TDM) may aid antibiotic prescription and implementation of initial empirical antimicrobial dosage recommendations. This study evaluated how gradual TDM introduction altered empirical dosages of meropenem and imipenem/cilastatin in our burn ICU. METHODS: Imipenem/cilastatin and meropenem use and daily empirical dosage at a five-bed burn ICU were analyzed retrospectively. Data for all burn admissions between 2001 and 2011 were extracted from the hospital's computerized information system. For each patient receiving a carbapenem, episodes of infection were reviewed and scored according to predefined criteria. Carbapenem trough serum levels were characterized. Prior to May 2007, TDM was available only by special request. Real-time carbapenem TDM was introduced in June 2007; it was initially available weekly and has been available 4 days a week since 2010. RESULTS: Of 365 patients, 229 (63%) received antibiotics (109 received carbapenems). Of 23 TDM determinations for imipenem/cilastatin, none exceeded the predefined upper limit and 11 (47.8%) were insufficient; the number of TDM requests was correlated with daily dose (r=0.7). Similar numbers of inappropriate meropenem trough levels (30.4%) were below and above the upper limit. Real-time TDM introduction increased the empirical dose of imipenem/cilastatin, but not meropenem. CONCLUSIONS: Real-time carbapenem TDM availability significantly altered the empirical daily dosage of imipenem/cilastatin at our burn ICU. Further studies are needed to evaluate the individual impact of TDM-based antibiotic adjustment on infection outcomes in these patients.
Abstract:
Hypomagnesemia is the most common electrolyte disturbance seen upon admission to the intensive care unit (ICU). Reliable predictors of its occurrence have not been described. The objective of this prospective study was to determine factors predictive of hypomagnesemia upon admission to the ICU. In a single tertiary cancer center, 226 patients admitted with various diagnoses were studied. Hypomagnesemia was defined by serum levels <1.5 mg/dl. Demographic data, type of cancer, cause of admission, previous history of arrhythmia, cardiovascular disease, renal failure, drug administration (particularly diuretics, antiarrhythmics, chemotherapy and platinum compounds), previous nutrition intake and presence of hypovolemia were recorded for each patient. Blood was collected for determination of serum magnesium, potassium, sodium, calcium, phosphorus, blood urea nitrogen and creatinine levels. Upon admission, 103 (45.6%) patients had hypomagnesemia and 123 (54.4%) had normomagnesemia. A normal dietary habit prior to ICU admission was associated with normal Mg levels (P = 0.007) and higher average levels of serum Mg (P = 0.002). Postoperative patients (N = 182) had lower levels of serum Mg (0.60 ± 0.14 mmol/l compared with 0.66 ± 0.17 mmol/l, P = 0.006). A stepwise multiple regression disclosed that only normal dietary habits (OR = 0.45; CI = 0.26-0.79) and being a postoperative patient (OR = 2.42; CI = 1.17-4.98) were significantly associated with serum Mg levels (overall model probability = 0.001). These findings should be used to identify patients at risk for this disturbance, even in other critically ill populations.
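Note that the cutoff above is given in mg/dl while the group means are reported in mmol/l. The two scales are related through magnesium's molar mass (about 24.305 g/mol); a small conversion sketch (the helper name is my own):

```python
MG_MOLAR_MASS_G_PER_MOL = 24.305  # molar mass of magnesium

def mg_dl_to_mmol_l(conc_mg_dl: float) -> float:
    """Convert serum magnesium from mg/dl to mmol/l:
    mg/dl -> mg/l (multiply by 10), then mg/l -> mmol/l
    (divide by the molar mass in g/mol)."""
    return conc_mg_dl * 10 / MG_MOLAR_MASS_G_PER_MOL

# The 1.5 mg/dl cutoff corresponds to roughly 0.62 mmol/l, consistent
# with the group means (0.60 and 0.66 mmol/l) reported above.
```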
Abstract:
Our objective was to compare the pattern of organ dysfunctions and outcomes of critically ill patients with systemic lupus erythematosus (SLE) with patients with other systemic rheumatic diseases (SRD). We studied 116 critically ill SRD patients, 59 SLE and 57 other-SRD patients. The SLE group was younger and included more women. Respiratory failure (61%) and shock (39%) were the most common causes of ICU admission for other-SRD and SLE groups, respectively. ICU length-of-stay was similar for the two groups. The 60-day survival adjusted for the groups’ baseline imbalances was not different (P = 0.792). Total SOFA scores were equal for the two groups at admission and during ICU stay, although respiratory function was worse in the other-SRD group at admission and renal and hematological functions were worse in the SLE group at admission. The incidence of severe respiratory dysfunction (respiratory SOFA >2) at admission was higher in the other-SRD group, whereas severe hematological dysfunction (hematological SOFA >2) during ICU stay was higher in the SLE group. SLE patients were younger and displayed a decreased incidence of respiratory failure compared to patients with other-SRDs. However, the incidences of renal and hematological failure and the presence of shock at admission were higher in the SLE group. The 60-day survival rates were similar.
Abstract:
Acute kidney injury (AKI) is common in critically ill patients. Diuretics are used without any evidence demonstrating a beneficial effect on renal function. The objective of the present study was to determine the incidence of AKI in an intensive care unit (ICU) and whether there is an association between the use of furosemide and the development of AKI. The study involved a hospital cohort in which 344 patients were consecutively enrolled from January 2010 to January 2011. A total of 132 patients (75 females and 57 males, average age 64 years) remained for analysis. Most exclusions were related to ICU discharge in the first 24 h. Laboratory, sociodemographic and clinical data were collected until the development of AKI, medical discharge or patient death. The incidence of AKI was 55% (95%CI = 46-64). The predictors of AKI found by univariate analysis were septic shock: OR = 3.12, 95%CI = 1.36-7.14; use of furosemide: OR = 3.27, 95%CI = 1.57-6.80; and age: OR = 1.02 (95%CI = 1.00-1.04). Analysis of the subgroup of patients with septic shock showed that the odds ratio of furosemide was 5.5 (95%CI = 1.16-26.02) for development of AKI. Age, use of furosemide, and septic shock were predictors of AKI in critically ill patients. In the subgroup of patients with sepsis/septic shock, use of furosemide increased the chance of developing AKI (68.4% versus 43.9% in the sample as a whole).
Abstract:
Objective: To characterize the transport of critically ill patients in an adult intensive care unit. Methods: Cross-sectional study including 459 intra-hospital transports of critically ill patients. Data were collected from patients' clinical records and from a form describing the materials and equipment necessary for the procedure, adverse events, and the transport team. Results: A total of 459 transports of 262 critically ill patients were carried out, an average of 51 transports per month. Of the patients transported, 41.3% were on ventilatory support and 34.5% were receiving vasoactive drugs. Adverse events occurred in 9.4% of transports, and 77.3% of the teams were composed of physicians, nurses and nursing technicians. Conclusion: Transports of critically ill patients occurred mostly in the morning, for computed tomography (CT) scans, with patients dependent on mechanical ventilation and vasoactive drugs. During transport the equipment functioned properly, and adverse events were attributed to clinical changes in the patients.
Abstract:
Background: The rapid shallow breathing index (RSBI) is the most widely used index in intensive care units as a predictor of weaning outcome, but differences in measurement technique have generated doubts about its predictive value. Objective: To investigate the influence of low levels of pressure support (PS) on the RSBI value of critically ill patients. Method: Prospective study including 30 patients on mechanical ventilation (MV) for 72 hours or more, ready for extubation. Prior to extubation, the RSBI was measured with the patient connected to the ventilator (Dräger™ Evita XL), receiving pressure support ventilation (PSV) with 5 cmH2O of positive end-expiratory pressure (PEEP) (RSBI_MIN), and then disconnected from the ventilator and connected to a Wright spirometer, with respiratory rate and exhaled tidal volume recorded for 1 min (RSBI_ESP). Patients were divided into groups according to outcome: successful extubation group (SG) and failed extubation group (FG). Results: Of the 30 patients, 11 (37%) failed extubation. In the within-group comparison (RSBI_MIN versus RSBI_ESP), the RSBI_MIN values were lower in both groups: SG (34.79 ± 4.67 versus 60.95 ± 24.64) and FG (38.64 ± 12.31 versus 80.09 ± 20.71; p<0.05). In the between-group comparison, there was no difference in RSBI_MIN (34.79 ± 14.67 versus 38.64 ± 12.31); however, RSBI_ESP was higher in patients with extubation failure: SG (60.95 ± 24.64) versus FG (80.09 ± 20.71; p<0.05). Conclusion: In critically ill patients on MV for more than 72 h, low levels of PS lead to underestimation of the RSBI, and the index needs to be measured with the patient breathing spontaneously, without the aid of pressure support.
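The index discussed above is the ratio of respiratory rate to tidal volume in litres; a minimal sketch (the function name is mine, and the threshold mentioned in the comment is the classical Yang-Tobin cutoff, not a value from this study):

```python
def rsbi(respiratory_rate_bpm: float, tidal_volume_ml: float) -> float:
    """Rapid shallow breathing index: breaths/min per litre of tidal
    volume. Values above ~105 breaths/min/L are classically taken to
    predict weaning failure (Yang-Tobin threshold)."""
    return respiratory_rate_bpm / (tidal_volume_ml / 1000)

# e.g. 25 breaths/min at a 400 ml tidal volume gives an RSBI of about 62.5
```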
Abstract:
Introduction: Candidemia in critically ill patients is usually a severe and life-threatening condition with a high crude mortality. Very few studies have focused on the impact of candidemia on ICU patient outcome, and attributable mortality remains controversial. This study was carried out to determine the attributable mortality of ICU-acquired candidemia in critically ill patients using propensity score matching analysis. Methods: A prospective observational study was conducted of all consecutive non-neutropenic adult patients admitted for at least seven days to 36 ICUs in Spain, France, and Argentina between April 2006 and June 2007. The probability of developing candidemia was estimated using a multivariate logistic regression model. Each patient with ICU-acquired candidemia was matched with two control patients with the nearest available Mahalanobis metric matching within the calipers defined by the propensity score. Standardized difference tests (SDT) for each variable before and after matching were calculated. Attributable mortality was determined by a modified Poisson regression model adjusted for those variables that still presented imbalances, defined as an SDT > 10%. Results: Thirty-eight candidemias were diagnosed in 1,107 patients (34.3 episodes/1,000 ICU patients). Patients with and without candidemia had an ICU crude mortality of 52.6% versus 20.6% (P < 0.001) and a crude hospital mortality of 55.3% versus 29.6% (P = 0.01), respectively. In the propensity-matched analysis, the corresponding figures were 51.4% versus 37.1% (P = 0.222) and 54.3% versus 50% (P = 0.680). After controlling for residual confounding with the Poisson regression model, the relative risk (RR) of ICU- and hospital-attributable mortality from candidemia was 1.298 (95% confidence interval (CI) 0.88 to 1.98) and 1.096 (95% CI 0.68 to 1.69), respectively.
Conclusions: ICU-acquired candidemia in critically ill patients is not associated with an increase in either ICU or hospital mortality.
Abstract:
BACKGROUND: Neurally adjusted ventilatory assist (NAVA) delivers assist in proportion to the patient's respiratory drive as reflected by the diaphragm electrical activity (EAdi). We examined to what extent NAVA can unload inspiratory muscles, and whether unloading is sustainable when implementing a NAVA level identified as adequate (NAVAal) during a titration procedure. METHODS: Fifteen adult, critically ill patients with a PaO2/fraction of inspired oxygen (FiO2) ratio < 300 mm Hg were studied. NAVAal was identified based on the change from a steep increase to a less steep increase in airway pressure (Paw) and tidal volume (Vt) in response to systematically increasing the NAVA level from low (NAVAlow) to high (NAVAhigh). NAVAal was implemented for 3 h. RESULTS: At NAVAal, the median esophageal pressure time product (PTPes) and EAdi values were reduced by 47% of NAVAlow (quartiles, 16 to 69% of NAVAlow) and 18% of NAVAlow (quartiles, 15 to 26% of NAVAlow), respectively. At NAVAhigh, PTPes and EAdi values were reduced by 74% of NAVAlow (quartiles, 56 to 86% of NAVAlow) and 36% of NAVAlow (quartiles, 21 to 51% of NAVAlow; p ≤ 0.005 for all). Parameters during 3 h on NAVAal were not different from parameters during titration at NAVAal, and were as follows: Vt, 5.9 mL/kg predicted body weight (PBW) [quartiles, 5.4 to 7.2 mL/kg PBW]; respiratory rate (RR), 29 breaths/min (quartiles, 22 to 33 breaths/min); mean inspiratory Paw, 16 cmH2O (quartiles, 13 to 20 cmH2O); PTPes, 45% of NAVAlow (quartiles, 28 to 57% of NAVAlow); and EAdi, 76% of NAVAlow (quartiles, 63 to 89% of NAVAlow). PaO2/FiO2 ratio, PaCO2, and cardiac performance during NAVAal were unchanged, while Paw and Vt were lower, and RR was higher when compared to conventional ventilation before implementing NAVAal.
CONCLUSIONS: Systematically increasing the NAVA level reduces respiratory drive, unloads respiratory muscles, and offers a method to determine an assist level that results in sustained unloading, low Vt, and stable cardiopulmonary function when implemented for 3 h.
Abstract:
INTRODUCTION: The paucity of data on resource use in critically ill patients with hematological malignancy, and these patients' perceived poor outcome, can lead to uncertainty over the extent to which intensive care treatment is appropriate. The aim of the present study was to assess the amount of intensive care resources needed for, and the effect of treatment of, hemato-oncological patients in the intensive care unit (ICU) in comparison with a nononcological patient population with a similar degree of organ dysfunction. METHODS: A retrospective cohort study of 101 ICU admissions of 84 consecutive hemato-oncological patients and 3,808 ICU admissions of 3,478 nononcological patients over a period of 4 years was performed. RESULTS: As assessed by Therapeutic Intervention Scoring System points, resource use was higher in hemato-oncological patients than in nononcological patients (median (interquartile range), 214 (102 to 642) versus 95 (54 to 224), P < 0.0001). Severity of disease at ICU admission was a less important predictor of ICU resource use than the necessity for specific treatment modalities. Hemato-oncological patients and nononcological patients with similar admission Simplified Acute Physiology Score scores had the same ICU mortality. In hemato-oncological patients, improvement of organ function within the first 48 hours of the ICU stay was the best predictor of 28-day survival. CONCLUSION: The presence of a hemato-oncological disease per se is associated with higher ICU resource use, but not with increased mortality. If withdrawal of treatment is considered, this decision should not be based on admission parameters but rather on the evolution of organ dysfunction.
Abstract:
INTRODUCTION: Hemodynamic management in intensive care patients guided by blood pressure and flow measurements often does not sufficiently reveal common hemodynamic problems. Trans-esophageal echocardiography (TEE) allows direct measurement of cardiac volumes and function. A new miniaturized TEE probe (mTEE) potentially provides a rapid and simplified approach to monitoring cardiac function. The aim of the study was to assess the feasibility of hemodynamic monitoring using mTEE in critically ill patients after a brief operator training period. METHODS: In the context of the introduction of mTEE in a large ICU, 14 ICU staff specialists with no previous TEE experience received six hours of training as mTEE operators. The feasibility of mTEE and the quality of the obtained hemodynamic information were assessed. Three standard views were acquired in hemodynamically unstable patients: 1) for assessment of left ventricular (LV) function, fractional area change (FAC) was obtained from a trans-gastric mid-esophageal short axis view; 2) right ventricular (RV) size was obtained from the mid-esophageal four chamber view; and 3) superior vena cava collapsibility, for detection of hypovolemia, was assessed from the mid-esophageal ascending aortic short axis view. Off-line blinded assessment by an expert cardiologist was considered the reference. Inter-rater agreement was assessed using Chi-square tests or correlation analysis, as appropriate. RESULTS: In 55 patients, 148 mTEE examinations were performed. Acquisition of loops of sufficient quality was possible in 110 examinations for the trans-gastric mid-esophageal short axis view, 118 for the mid-esophageal four chamber view, and 125 for the mid-esophageal ascending aortic short axis view. Inter-rater agreement (Kappa) between ICU mTEE operators and the reference was 0.62 for estimates of LV function, 0.65 for RV dilatation, 0.76 for hypovolemia, and 0.77 for occurrence of pericardial effusion (all P < 0.0001).
There was a significant correlation between the FAC measured by ICU operators and the reference (r = 0.794, P (one-tailed) < 0.0001). CONCLUSIONS: Echocardiographic examinations using mTEE after brief bedside training were feasible and of sufficient quality in the majority of examined ICU patients, with good inter-rater reliability between mTEE operators and an expert cardiologist. Further studies are required to assess the impact of hemodynamic monitoring by mTEE on relevant patient outcomes.
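Agreement statistics like the kappa values reported above compare observed agreement between two raters with the agreement expected by chance. A minimal, generic sketch of Cohen's kappa for two raters over the same cases (illustrative only, not the study's actual statistical code):

```python
from collections import Counter

def cohens_kappa(rater_a: list, rater_b: list) -> float:
    """Cohen's kappa: (observed agreement - chance agreement)
    divided by (1 - chance agreement)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: sum over categories of the product of each
    # rater's marginal proportion for that category.
    chance = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - chance) / (1 - chance)

# Two raters agreeing on 3 of 4 binary calls yield kappa = 0.5
```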
Abstract:
Blood loss and bleeding complications may often be observed in critically ill patients on renal replacement therapies (RRT). Here we investigate procedural (i.e. RRT-related) and non-procedural blood loss, as well as transfusion requirements, in relation to the chosen mode of dialysis (intermittent haemodialysis [IHD] versus continuous veno-venous haemofiltration [CVVH]). Two hundred and fifty-two patients (122 CVVH; 159 male; aged 61.5±13.9 years) with dialysis-dependent acute renal failure were analysed in a sub-analysis of the prospective randomised controlled clinical trial CONVINT, comparing IHD and CVVH. Bleeding complications, including severity of bleeding and RRT-related blood loss, were assessed. We observed that 3.6% of patients died in relation to severe bleeding episodes (between-group P = 0.94). Major all-cause bleeding complications were observed in 23% of IHD versus 26% of CVVH group patients (P = 0.95). Under CVVH, the rate of RRT-related blood loss events was higher (57.4% versus 30.4%, P = 0.01), and the mean total blood volume lost was increased (222.3±291.9 versus 112.5±222.7 ml per patient, P < 0.001). Overall transfusion rates did not differ between the study groups. In patients with sepsis, transfusion rates of all blood products were significantly higher when compared to cardiogenic shock (all P < 0.01) or other conditions. In conclusion, procedural and non-procedural blood loss may often be observed in critically ill patients on RRT. In CVVH-treated patients, procedural blood loss was increased but overall transfusion rates remained unchanged. Our data show that, in this regard, IHD and CVVH may be regarded as equivalent approaches in critically ill patients with dialysis-dependent acute renal failure.