86 results for 321009 Intensive Care
Abstract:
Visual inspection of a patient's urine has long been used by physicians, with colour recognised as having important clinical implications. In this review the authors will revisit this ancient pastime with relevance to contemporary medical practice.
Abstract:
Objective: To compare the incidence of ventilator-associated pneumonia (VAP) in patients ventilated in intensive care by means of circuits humidified with a hygroscopic heat-and-moisture exchanger with a bacterial viral filter (HME), hot-water humidification with a heater wire in both inspiratory and expiratory circuit limbs (DHW), or hot-water humidification with a heater wire in the inspiratory limb only (SHW). Design: A prospective, randomized trial. Setting: A metropolitan teaching hospital's general intensive care unit. Patients: Three hundred eighty-one patients requiring a minimum period of mechanical ventilation of 48 hrs. Interventions: Patients were randomized to humidification with use of an HME (n = 190), SHW (n = 94), or DHW (n = 97). Measurements and Main Results: Study end points were VAP diagnosed on the basis of the Clinical Pulmonary Infection Score (CPIS) (1), HME resistance after 24 hrs of use, endotracheal tube resistance, and HME use per patient. VAP occurred with similar frequency in all groups (13%, HME; 14%, DHW; 10%, SHW; p = 0.61) and was predicted only by current smoking (adjusted odds ratio [AOR], 2.1; 95% confidence interval [CI], 1.1-3.9; p = 0.03) and ventilation days (AOR, 1.05; 95% CI, 1.0-1.2; p = 0.001); VAP was less likely for patients with an admission diagnosis of pneumonia (AOR, 0.40; 95% CI, 0.2-0.4; p = 0.04). HME resistance after 24 hrs of use, measured at a gas flow of 50 L/min, was 0.9 cm H2O (0.4-2.9). Endotracheal tube resistance was similar for all three groups (16-19 cm H2O min/L; p = 0.2), as were suction frequency, secretion thickness, and blood on suctioning (p = 0.32, p = 0.06, and p = 0.34, respectively). HME use per patient per day was 1.13. Conclusions: Humidification technique influences neither VAP incidence nor secretion characteristics, but HMEs may have air-flow resistance higher than manufacturer specifications after 24 hrs of use.
Abstract:
This paper is a brief report of the symposium, Improving the Evidence Base for Anaesthesia and Intensive Care, organized by the MASTER Anaesthesia Trial Study Group at the Annual Scientific Meeting of the Australian and New Zealand College of Anaesthetists, Newcastle, N.S.W. on Tuesday, May 5, 1998.
Abstract:
In a primary analysis of a large recently completed randomized trial in 915 high-risk patients undergoing major abdominal surgery, we found no difference in outcome between patients receiving perioperative epidural analgesia and those receiving IV opioids, apart from the incidence of respiratory failure. Therefore, we performed a selected number of predetermined subgroup analyses to identify specific types of patients who may have derived benefit from epidural analgesia. We found no difference in outcome between epidural and control groups in subgroups at increased risk of respiratory or cardiac complications or undergoing aortic surgery, nor in a subgroup with failed epidural block (all P > 0.05). There was a small reduction in the duration of postoperative ventilation (geometric mean [SD]: control group, 0.3 [6.5] h, versus epidural group, 0.2 [4.8] h, P = 0.048). No differences were found in length of stay in intensive care or in the hospital. There was no relationship between frequency of use of epidural analgesia in routine practice outside the trial and benefit from epidural analgesia in the trial. We found no evidence that perioperative epidural analgesia significantly influences major morbidity or mortality after major abdominal surgery.
Abstract:
A postal survey was conducted of all hospitals in Australia known to have a department of anaesthesia and an intensive care or high dependency unit. Each hospital was asked to report the anaesthetic and postoperative analgesic techniques used for the last ten cases of four common major surgical procedures: aorto-femoral bypass, repair of an abdominal aortic aneurysm, hemicolectomy and anterior resection of the rectum. Half of the 76 hospitals sent a survey form completed and returned it. Responding hospitals were larger, on average, than non-responding ones, but otherwise typical of them in terms of university affiliation and metropolitan versus rural location. For each of the procedures studied, the proportion of cases in which epidural block was used intra- or postoperatively varied from 0% to 100%. Depending on the procedure, between 65% and 85% of hospitals used epidural block sometimes, with between 10% and 90% of patients in these hospitals being managed with this technique. There is wide variation in the use of epidural block, intra- and postoperatively, in Australia, variation that is unlikely to be explained by systematic differences between institutions in the patients seen or their suitability for one or other technique. This pattern of practice mirrors the lack of agreement about the proper place of epidural techniques evident in the recent literature. There is a widespread belief among clinicians that this is a question of great importance. Accordingly, we believe that anaesthetists and surgeons share an ethical responsibility to enter suitable patients in an appropriately designed randomized controlled trial in order to resolve this question.
Abstract:
All cells require inorganic sulfate for normal function. Sulfate is among the most important macronutrients in cells and is the fourth most abundant anion in human plasma (300 μM). Sulfate is the major sulfur source in many organisms, and because it is a hydrophilic anion that cannot passively cross the lipid bilayer of cell membranes, all cells require a mechanism for sulfate influx and efflux to ensure an optimal supply of sulfate in the body. The class of proteins involved in moving sulfate into or out of cells is called sulfate transporters. To date, numerous sulfate transporters have been identified in tissues and cells from many origins. These include the renal sulfate transporters NaSi-1 and sat-1, the ubiquitously expressed diastrophic dysplasia sulfate transporter DTDST, the intestinal sulfate transporter DRA that is linked to congenital chloride diarrhea, and the erythrocyte anion exchanger AE1. These transporters have only been isolated in the last 10-15 years, and their physiological roles and contributions to body sulfate homeostasis are just now beginning to be determined. This review focuses on the structural and functional properties of mammalian sulfate transporters and highlights some of the regulatory mechanisms that control their expression in vivo under normal physiological and pathophysiological states.
Abstract:
Objectives: To investigate the pharmacokinetics of intravenous ciprofloxacin 200 mg every 8 h in critically ill patients on continuous veno-venous haemodiafiltration (CVVHDF), one form of continuous renal replacement therapy (CRRT). Design and setting: Open, prospective clinical study in a multidisciplinary intensive care unit in a university-affiliated tertiary referral hospital. Patients: Six critically ill patients with acute renal failure on CVVHDF. Interventions: Timed blood and ultrafiltrate samples were collected to allow pharmacokinetics and clearances to be calculated for initial and subsequent doses of 200 mg intravenous ciprofloxacin. CVVHDF was performed with 1 l/h of dialysate and 2 l/h of predilution filtration solution, producing 3 l/h of dialysis effluent. The blood was pumped at 200 ml/min using a Gambro BMM-10 blood pump through a Hospal AN69HF haemofilter. Measurements and results: Ten pharmacokinetic profiles were measured. The CVVHDF displayed a urea clearance of 42 +/- 3 ml/min and removed ciprofloxacin with a clearance of 37 +/- 7 ml/min. This rate was 2-2.5 times greater than previously published for ciprofloxacin in other forms of CRRT. On average, the CVVHDF was responsible for clearing a fifth of all ciprofloxacin eliminated (21 +/- 10%). The total body clearance of ciprofloxacin was 12.2 +/- 4.3 l/h. The trough concentration following the initial dose was 0.7 +/- 0.3 mg/l. The area under the plasma concentration-time curve over a 24-h period ranged from 21 to 55 mg·h/l. Conclusions: Intravenous ciprofloxacin 600 mg/day in critically ill patients using this form of CRRT produced adequate plasma levels for many resistant microbes found in intensive care units.
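The reported figures can be cross-checked with the standard pharmacokinetic identity AUC = Dose / CL. A minimal sketch (not from the study; function name and units are illustrative) applying it to the abstract's mean values:

```python
# Hedged sketch: relating daily dose, total body clearance, and 24-h AUC
# via the standard steady-state identity AUC = Dose / CL.
# Numbers below are the means reported in the abstract.

def auc_24h(daily_dose_mg: float, clearance_l_per_h: float) -> float:
    """Steady-state 24-h area under the plasma concentration-time curve
    (mg·h/l) for a given daily dose (mg) and total body clearance (l/h)."""
    return daily_dose_mg / clearance_l_per_h

# 600 mg/day ciprofloxacin with the reported mean clearance of 12.2 l/h
print(round(auc_24h(600, 12.2), 1))  # ≈ 49.2 mg·h/l
```

The result falls within the 21-55 mg·h/l range reported in the abstract, as expected given the spread in individual clearances (12.2 +/- 4.3 l/h).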
Abstract:
The aim of this study was to determine the pharmacokinetic profile of the normal recommended dose of ceftriaxone in critically ill patients and to establish whether the current daily dosing recommendation maintains plasma concentrations adequate for antibacterial efficacy. Ceftriaxone at a recommended dose of 2 g iv was administered od to 12 critically ill patients with severe sepsis and normal serum creatinine concentrations. Blood samples were taken at predetermined intervals over the first 24 h and on day 3 for measurement of ceftriaxone concentrations. There was wide variability in drug disposition, explained by the presence of variable renal function and identified by the measurement of creatinine clearance. In nine patients with normal renal function, there was a high level of ceftriaxone clearance (mean +/- S.D., 41 +/- 12 mL/min) and volume of distribution (20 +/- 3.3 L), which resulted in an elimination half-life of 6.4 +/- 1.1 h. In comparison with normal subjects, ceftriaxone clearance was increased 100%, volume of distribution was increased 90% and the elimination half-life was similar. Three patients had substantially suboptimal plasma ceftriaxone concentrations. We confirm previous findings that ceftriaxone clearance in critically ill patients correlates with renal clearance by glomerular filtration. The elimination half-life is prolonged (21.4 +/- 9.8 h) in critically ill patients with renal failure when compared with previously published data in non-critically ill patients with renal failure. We conclude that in critically ill patients with normal renal function, inadequate plasma concentrations may result following od bolus dosing of ceftriaxone. Drug accumulation may occur in critically ill patients with renal failure.
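The relationship between the reported clearance, volume of distribution and half-life follows from t1/2 = ln 2 · Vd / CL. A minimal sketch (not from the study; function name and unit conversion are illustrative) using the abstract's mean values:

```python
import math

# Hedged sketch: elimination half-life from volume of distribution and
# clearance, via the one-compartment relation t1/2 = ln2 * Vd / CL.
# Numbers below are the means reported in the abstract.

def half_life_h(vd_l: float, cl_ml_per_min: float) -> float:
    """Elimination half-life (h) from volume of distribution (L)
    and clearance (mL/min)."""
    cl_l_per_h = cl_ml_per_min * 60 / 1000  # convert mL/min -> L/h
    return math.log(2) * vd_l / cl_l_per_h

# Mean values from the abstract: Vd = 20 L, clearance = 41 mL/min
print(round(half_life_h(20, 41), 1))  # ≈ 5.6 h
```

This is consistent with the reported elimination half-life of 6.4 +/- 1.1 h, given the spread in the individual clearance and volume estimates.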
Abstract:
Background/Aims: Patients with chronic liver disease undergoing liver transplantation have reduced body fat and muscle mass. The extent to which nutritional indicators and Child-Pugh class are predictive of postoperative outcome in adults is unclear. The aims of this study were to determine, in adult patients undergoing transplantation: 1) the influence of preoperative Child-Pugh class and nutritional indicators on early transplant outcomes and one-year survival; 2) the relationship between nutritional indicators and Child-Pugh class and disease type. This study included 80 patients (1990-1994). Methodology: The nutritional indicators utilized were grip strength, triceps skinfold thickness and uncorrected mid-arm muscle area. Measured outcomes were ventilator time, intensive care stay, postoperative hospital stay and one-year survival. Results: Early morbidity was determined in survivors. Child-Pugh class C patients required longer ventilation and spent more time in the intensive care unit than Child-Pugh classes A and B. No significant relationships were found for length of hospital stay. Relationships between the nutritional indicators (when controlled for Child-Pugh class) and early morbidity could not be determined due to insufficient data. No relationship was established between one-year survival and Child-Pugh class or the nutritional indicators. Grip strength and mid-arm muscle area were lower in patients in Child-Pugh classes B and C. Parenchymal liver disease was associated with lower grip strength and mid-arm muscle area when compared to cholestatic disease. Conclusions: Child-Pugh class C is associated with greater early postoperative morbidity. Advanced Child-Pugh class is also associated with diminished muscle status and parenchymal disease.
Abstract:
The management of neurotrauma in Australia has been one of the significant public health triumphs of the last 30 years of the 20th century. State and national government agencies act in a coordinated fashion to collect data and to promote research on how to manage neurotrauma patients. Between 1970 and 1995, fatalities from road accidents decreased by 47%. Hospital admissions have decreased by 40% despite a 40% increase in the population and a 120% increase in registered vehicles. Fatalities per 10,000 registered vehicles were 8.05 in 1970, falling to 1.84 in 1995, while fatalities per 10,000 population were 3 in 1970, falling to 1.11 in 1995. Hospitalization from road crashes decreased 23% between March 1988 and March 1997. Public education has steadily improved, backed by state public health sources. A uniform code of road safety laws has been adopted, backed by legislation, legal penalties and increasing police enforcement. Clinical care of patients has improved as a result of faster communications, telemedicine, trauma systems, the CT scanner, intensive care units, and improved monitoring. Patient rehabilitation and counseling are now carried out at units accredited by the Australian Council on Health Care Standards.