874 results for 110310 Intensive Care
Abstract:
Continuous epidural analgesia (CEA) and continuous spinal postoperative analgesia (CSPA) provided by a mixture of local anaesthetic and opioid are widely used for postoperative pain relief. With the introduction of so-called microcatheters, for example, CSPA has found its way particularly into orthopaedic surgery. These techniques, however, may be associated with dose-dependent side-effects such as hypotension, weakness in the legs, and nausea and vomiting. At times, they may fail to offer sufficient analgesia, e.g., because of a misplaced catheter. The correct position of an epidural catheter might be confirmed by the supposedly easy and reliable epidural stimulation test (EST). The aims of this thesis were to determine a) whether the efficacy, tolerability, and reliability of CEA might be improved by adding the α2-adrenergic agonists adrenaline and clonidine to the CEA mixture, and by the repeated use of EST during CEA; and b) the feasibility of CSPA given through a microcatheter after vascular surgery. Studies I–IV were double-blinded, randomized, controlled trials; Study V was of a diagnostic, prospective nature. Patients underwent arterial bypass surgery of the legs (I, n=50; IV, n=46), total knee arthroplasty (II, n=70; III, n=72), and abdominal surgery or thoracotomy (V, n=30). Postoperative lumbar CEA consisted of regular mixtures of ropivacaine and fentanyl, either without or with adrenaline (2 µg/ml (I) and 4 µg/ml (II)) or clonidine (2 µg/ml (III)). CSPA (IV) was given through a microcatheter (28G) and contained either ropivacaine (max. 2 mg/h) or a mixture of ropivacaine (max. 1 mg/h) and morphine (max. 8 µg/h). Epidural catheter tip position (V) was evaluated both by EST, at the moment of catheter placement and several times during CEA, and by epidurography as the reference diagnostic test. CEA and CSPA were administered for 24 or 48 h. Study parameters included pain scores assessed with a visual analogue scale, requirements of rescue pain medication, vital signs, and side-effects. Adrenaline (I and II) had no beneficial influence on the efficacy or tolerability of CEA. The total amounts of epidurally infused drugs were even increased in the adrenaline group in Study II (p=0.02, RM ANOVA). Clonidine (III) augmented pain relief with lowered amounts of epidurally infused drugs (p=0.01, RM ANOVA) and a reduced need for rescue oxycodone given i.m. (p=0.027, MW-U; median difference 3 mg (95% CI 0–7 mg)). Clonidine did not contribute to sedation, and its influence on haemodynamics was minimal. CSPA (IV) provided satisfactory pain relief with only limited blockade of the legs (no inter-group differences). EST (V) was often associated with technical problems and difficulties of interpretation; for example, it failed to identify the four patients whose catheters were outside the spinal canal already at the time of catheter placement. As adjuvants to lumbar CEA, clonidine only slightly improved pain relief, while adrenaline did not provide any benefit. The role of EST applied at the time of epidural catheter placement or repeatedly during CEA remains open. The microcatheter CSPA technique appeared effective and reliable, but needs to be compared with routine CEA after peripheral arterial bypass surgery.
Abstract:
Septic shock is a common killer in intensive care units (ICUs). The most crucial issues concerning outcome are the early and aggressive start of treatment aimed at normalization of hemodynamics and the early start of antibiotics during the very first hours. The optimal targets of hemodynamic treatment, and the impact of hemodynamic treatment on survival after the first resuscitation period, are less well known. The objective of this study was to evaluate different aspects of the hemodynamic pattern in septic shock, with special attention to prediction of outcome. In particular, components of early treatment and monitoring in the ICU were assessed. A total of 401 patients, 218 with septic shock and 192 with severe sepsis or septic shock, were included in the study. The patients were treated in 24 Finnish ICUs during 1999–2005; 295 of the patients were included in the Finnish national epidemiologic Finnsepsis study. We found that the most important hemodynamic variables concerning outcome were the mean arterial pressure (MAP) and lactate during the first six hours in the ICU, and the MAP and a mixed venous oxygen saturation (SvO2) below 70% during the first 48 hours. MAP below 65 mmHg and SvO2 below 70% were the best predictive thresholds. A high central venous pressure (CVP) also correlated with adverse outcome. We assessed the correlation and agreement of SvO2 and central venous oxygen saturation (ScvO2) in septic shock during the first day in the ICU. The mean SvO2 was below the ScvO2 during early sepsis. The bias of the difference was 4.2% (95% limits of agreement -8.1% to 16.5%) by Bland-Altman analysis. The difference between the saturation values correlated significantly with cardiac index and oxygen delivery. Thus, ScvO2 cannot be used as a substitute for SvO2 in hemodynamic monitoring in the ICU. Several biomarkers have been investigated for their ability to help in diagnosis or outcome prediction in sepsis. We assessed the predictive value of N-terminal pro-brain natriuretic peptide (NT-proBNP) for mortality in severe sepsis or septic shock. The NT-proBNP levels were significantly higher in hospital nonsurvivors, and NT-proBNP 72 hours after inclusion was an independent predictor of hospital mortality. Acute cardiac load contributed to NT-proBNP values at admission, but renal failure was the main confounding factor later. The accuracy of NT-proBNP, however, was not sufficient for clinical decision-making concerning outcome prediction. Delays in the start of treatment are associated with a poorer prognosis in sepsis. We assessed how the early treatment guidelines were adopted, and what the impact of early treatment on mortality in septic shock was in Finland. We found that early treatment was not optimal in Finnish hospitals, and this was reflected in mortality. A delayed initiation of antimicrobial agents in particular was associated with an unfavorable outcome.
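The agreement analysis described above follows the standard Bland-Altman approach: the bias is the mean of the paired differences (ScvO2 minus SvO2) and the 95% limits of agreement are the bias plus or minus 1.96 times their standard deviation. A minimal sketch in Python with hypothetical saturation values (not data from the study) is shown below.

```python
# Minimal sketch of a Bland-Altman agreement analysis, as described above,
# using hypothetical paired SvO2/ScvO2 values (not data from the study).
import numpy as np

svo2  = np.array([62.0, 68.5, 71.0, 65.5, 73.0, 69.0])   # mixed venous O2 saturation, %
scvo2 = np.array([66.5, 72.0, 74.5, 71.0, 78.5, 73.5])   # central venous O2 saturation, %

diff = scvo2 - svo2                      # per-patient difference, %
bias = diff.mean()                       # systematic difference (bias)
sd = diff.std(ddof=1)                    # sample SD of the differences
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd   # 95% limits of agreement

print(f"bias = {bias:.1f}%, 95% limits of agreement {loa_low:.1f}% to {loa_high:.1f}%")
```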
Abstract:
The Molecular Adsorbent Recirculating System (MARS) is an extracorporeal albumin dialysis device used in the treatment of liver failure patients. This treatment was first used in Finland in 2001, and since then, over 200 patients have been treated. The aim of this thesis was to evaluate the impact of MARS treatment on patient outcome, on clinical and biochemical variables, and on the psychological and economic aspects of the treatment in Finland. This thesis encompasses 195 MARS-treated patients (including patients with acute liver failure (ALF), acute-on-chronic liver failure (AOCLF) and graft failure) and a historical control group of 46 ALF patients who did not undergo MARS. All patients received similar standard medical therapy at the same intensive care unit. The baseline data (demographics, laboratory and clinical variables) and MARS treatment-related and health-related quality-of-life data were recorded before and after treatment. The direct medical costs were determined for a period of 3.5 years. Additionally, the outcome of patients (survival, native liver recovery and need for liver transplantation) and factors predicting survival were investigated. In the outcome analysis, the MARS-treated ALF patients had a higher 6-month survival (75% vs. 61%, P=0.07) and native liver recovery rate (49% vs. 17%, P<0.001), and a lower need for transplantation (29% vs. 57%, P=0.001), than the historical controls. However, the etiological distribution of the ALF patients referred to our unit has changed considerably over the past decade, and the percentage of patients with a more favorable prognosis has increased. The etiology of liver failure was the most important predictor of outcome. Other factors predicting survival in ALF included hepatic encephalopathy, coagulation factor levels and liver enzyme levels prior to MARS treatment. In terms of prognosis, MARS treatment of the cirrhotic AOCLF patient seems meaningful only when the patient is eligible for transplantation. MARS treatment appears to halt the progression of encephalopathy and to reduce the blood concentrations of neuroactive amino acids and of albumin-bound and water-soluble toxins. In general, the effects of MARS treatment seem to stabilize the patients, thus allowing additional time either for the native liver to recover, or for the patients to endure the prolonged wait for transplantation. Furthermore, for the ALF patients, MARS treatment appeared to be less costly and more cost-efficient than standard medical therapy alone. In conclusion, MARS treatment appears to have a beneficial effect on patient outcome in ALF and in those AOCLF patients who can be bridged to transplantation.
Abstract:
Intensive care is to be provided to patients who benefit from it, in an ethical, efficient, effective and cost-effective manner. This implies long-term qualitative and quantitative analysis of intensive care procedures and the related resources. The study population consists of 2709 patients treated in the general intensive care unit (ICU) of Helsinki University Hospital. The study sectors investigate intensive care patients' mortality, quality of life (QOL), Quality-Adjusted Life-Years (QALY units) and factors related to severity of illness, length of stay (LOS), patient age, evaluation period, as well as experiences and memories connected with the ICU episode. In addition, the study examines the qualities of two QOL measures, the RAND 36-Item Health Survey 1.0 (RAND-36) and the 5-item EuroQol-5D (EQ-5D), and assesses the correlation of their results. Patients treated in 1995 responded to the RAND-36 questionnaire in 1996. All patients treated from 1995 to 2000 received a QOL questionnaire in 2001, when 1–7 years had elapsed since the intensive care. The response rate was 79.5%. Main results: 1) Of the patients who died within the first year (n = 1047), 66% died during the intensive care period or within the following month. The non-survivors were older than the survivors, generally had higher than average APACHE II and SOFA scores depicting the severity of illness, and had a longer ICU LOS and a shorter hospital stay than the survivors (p < 0.001). Mortality of patients receiving conservative treatment was higher than that of those receiving surgical treatment. Patients replying to the QOL survey in 2001 (n = 1099) had recovered well: 97% of them lived at home. More than half considered their QOL good or extremely good, 40% satisfactory and 7% bad. All QOL indices of those of working age were considerably lower (p < 0.001) than the comparable figures for the age- and gender-adjusted Finnish population. The 5-year monitoring period made evident that mental recovery was slower than physical recovery. 2) The results of RAND-36 and EQ-5D correlated well (p < 0.01). The RAND-36 profile measure distinguished more clearly between the different categories of QOL and their levels. EQ-5D measured the patient groups' general QOL well, and its sum index was used to calculate QALY units. 3) QALY units were calculated by multiplying the time the patient survived after the ICU stay, or the expected life-years, by the EQ-5D sum index. Aging automatically lowers the number of QALY units. Patients under the age of 65 receiving conservative treatment benefited from treatment to a greater extent, measured in QALY units, than their peers receiving surgical treatment, but in the age group 65 and over, patients with surgical treatment received higher QALY ratings than recipients of conservative treatment. 4) The intensive care experience and QOL ratings were connected. The QOL indices were statistically highest for those with memories of intensive care as a positive experience, although their illness requiring intensive care was less serious than average. No statistically significant differences were found in the QOL indices of those with negative memories, no memories, or those who did not express the quality of their experiences.
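Result 3) above defines QALY units as the product of survival time (or expected life-years) and the EQ-5D sum index. A minimal sketch of that arithmetic, with hypothetical numbers rather than study data, is shown below.

```python
# Minimal sketch of the QALY computation described in result 3) above:
# QALYs = survival time after the ICU stay (or expected life-years) x EQ-5D sum index.
# The numbers are hypothetical, not values from the study.
def qaly_units(life_years: float, eq5d_index: float) -> float:
    """Quality-adjusted life-years for one patient."""
    return life_years * eq5d_index

# e.g. a patient surviving 4.0 years after the ICU stay with an EQ-5D index of 0.75
print(qaly_units(4.0, 0.75))  # -> 3.0 QALY units
```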
Abstract:
The aim of the present thesis was to study the role of the epithelial sodium channel (ENaC) in the clearance of fetal lung fluid in the newborn infant by measuring airway epithelial expression of ENaC, nasal transepithelial potential difference (N-PD), and lung compliance (LC). In addition, the effect of postnatal dexamethasone on airway epithelial ENaC expression was measured in preterm infants with bronchopulmonary dysplasia (BPD). The patient population consisted of selected term newborn infants born in the Department of Obstetrics (Studies II–IV) and selected preterm newborn infants treated in the neonatal intensive care unit of the Hospital for Children and Adolescents (Studies I and IV) of the Helsinki University Central Hospital in Finland. A small population of preterm infants suffering from BPD was included in Study I. Studies I, III, and IV included airway epithelial measurement of ENaC, and Studies II and III measurement of N-PD and LC. In Study I, the ENaC expression analyses were performed in the Research Institute of the Hospital for Sick Children in Toronto, Ontario, Canada; in the following studies, analyses were performed in the Scientific Laboratory of the Hospital for Children and Adolescents. N-PD and LC measurements were performed at the bedside in these hospitals. In term newborn infants, the percentage of amiloride-sensitive N-PD, a surrogate for ENaC activity, measured during the first 4 postnatal hours correlated positively with LC measured 1 to 2 days postnatally. Preterm infants with BPD had, after a therapeutic dose of dexamethasone, higher airway epithelial ENaC expression than before treatment. These patients were subsequently weaned from mechanical ventilation, probably as a result of the clearance of extra fluid from the alveolar spaces. In addition, we found that in preterm infants ENaC expression increases with gestational age (GA). In preterm infants, ENaC expression in the airway epithelium was lower than in term newborn infants. During the early postnatal period, airway epithelial βENaC expression decreased significantly in infants born both preterm and at term. Term newborn infants delivered vaginally had significantly lower airway epithelial expression of αENaC after the first postnatal day than did those delivered by cesarean section. The functional studies showed no difference in N-PD between infants delivered vaginally and by cesarean section. We therefore conclude that the low airway epithelial expression of ENaC in the preterm infant and the correlation of N-PD with LC in the term infant indicate a role for ENaC in perinatal pulmonary adaptation and in the pathogenesis of neonatal respiratory distress. Because dexamethasone raised ENaC expression in preterm infants with BPD, and the infants were subsequently weaned from ventilator therapy, we suggest that studies on the treatment of respiratory distress in the preterm infant should include the induction of ENaC activity.
Abstract:
Acute renal failure (ARF) is a clinical syndrome characterized by a rapidly decreasing glomerular filtration rate, which results in disturbances of electrolyte and acid-base homeostasis, derangement of extracellular fluid volume, and retention of nitrogenous waste products, and is often associated with decreased urine output. ARF affects about 5–25% of patients admitted to intensive care units (ICUs) and is linked to high mortality and morbidity rates. In this thesis, the outcome of critically ill patients with ARF and factors related to outcome were evaluated. A total of 1662 patients from two ICUs and one acute dialysis unit in Helsinki University Hospital were included. In Study I, the prevalence of ARF was calculated and classified according to two ARF-specific scoring methods, the RIFLE classification and the classification created by Bellomo et al. (2001). Study II evaluated monocyte human histocompatibility leukocyte antigen-DR (HLA-DR) expression and plasma levels of one proinflammatory (interleukin (IL) 6) and two anti-inflammatory (IL-8 and IL-10) cytokines in predicting survival of critically ill ARF patients. Study III investigated serum cystatin C as a marker of renal function in ARF and its power in predicting survival of critically ill ARF patients. Study IV evaluated the effect of intermittent hemodiafiltration (HDF) on myoglobin elimination from plasma in severe rhabdomyolysis. Study V assessed long-term survival and health-related quality of life (HRQoL) in ARF patients. Neither of the ARF-specific scoring methods showed good discriminative power regarding hospital mortality. The maximum RIFLE score during the first three days in the ICU was an independent predictor of hospital mortality. As a marker of renal dysfunction, serum cystatin C showed no benefit over plasma creatinine in detecting ARF or predicting patient survival. Neither cystatin C, nor plasma concentrations of IL-6, IL-8, and IL-10, nor monocyte HLA-DR expression was clinically useful in predicting mortality in ARF patients. HDF may be used to clear myoglobin from plasma in rhabdomyolysis, especially if alkalization of diuresis does not succeed. The long-term survival of patients with ARF was found to be poor, and the HRQoL of those who survive is lower than that of the age- and gender-matched general population.
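For orientation, the RIFLE classification mentioned above grades ARF as Risk, Injury, or Failure by the fold-increase of serum creatinine over baseline (roughly 1.5-, 2-, and 3-fold), alongside GFR-decrease and urine-output criteria. The sketch below illustrates only the creatinine arm and is a simplified illustration, not the scoring code used in the thesis.

```python
# Simplified sketch of the creatinine arm of the RIFLE classification (Risk /
# Injury / Failure by fold-increase of serum creatinine over baseline); the full
# criteria also use GFR decrease and urine output, which are omitted here.
def rifle_creatinine_class(baseline_crea_umol_l: float, current_crea_umol_l: float) -> str:
    ratio = current_crea_umol_l / baseline_crea_umol_l
    if ratio >= 3.0:
        return "Failure"
    if ratio >= 2.0:
        return "Injury"
    if ratio >= 1.5:
        return "Risk"
    return "No ARF by the creatinine criterion"

print(rifle_creatinine_class(80.0, 250.0))  # 250/80 ~ 3.1-fold increase -> "Failure"
```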
Abstract:
Technical or contaminated ethanol products are sometimes ingested, either accidentally or on purpose. Typical misused products are black-market liquor and automotive products, e.g., windshield washer fluids. In addition to less toxic solvents, these liquids may contain deadly methanol. Symptoms of even a lethal solvent poisoning are often non-specific at the early stage. The present series of studies was carried out to develop a method for breath-based diagnostics of solvent intoxication, to speed up a diagnostic procedure conventionally based on blood tests. Especially in the case of methanol ingestion, the analysis method should be sufficiently sensitive and accurate to detect even small amounts of methanol in a mixture of ethanol and other less toxic components. In addition to the studies on the FT-IR method, the Dräger 7110 evidential breath analyzer was examined to determine its ability to reveal a coexisting toxic solvent. An industrial Fourier transform infrared (FT-IR) analyzer was modified for breath testing: the sample cell fittings were widened and the cell size reduced in order to obtain an alveolar sample directly from a single exhalation. The performance and feasibility of the Gasmet FT-IR analyzer were tested in clinical settings and in the laboratory. Human breath screening studies were carried out with healthy volunteers, inebriated homeless men, emergency room patients and methanol-intoxicated patients. A number of the breath analysis results were compared with blood test results in order to approximate the blood-breath relationship. In the laboratory experiments, the analytical performance of the Gasmet FT-IR analyzer and the Dräger 7110 evidential breath analyzer was evaluated by means of artificial samples resembling exhaled breath. The investigations demonstrated that a successful breath ethanol analysis by the Dräger 7110 evidential breath analyzer could exclude any significant methanol intoxication. In contrast, the device did not detect very high levels of acetone, 1-propanol and 2-propanol in simulated breath, as it was not equipped to recognize the interfering components. According to the studies, the Gasmet FT-IR analyzer was adequately sensitive, selective and accurate for solvent intoxication diagnostics. In addition to diagnostics, the fast breath solvent analysis proved feasible for monitoring the ethanol and methanol concentrations during haemodialysis treatment. Because of the simplicity of the sampling and analysis procedure, non-laboratory personnel, such as police officers or social workers, could also operate the analyzer for screening purposes.
Abstract:
Pediatric renal transplantation (TX) has evolved greatly during the past few decades, and today TX is considered the standard care for children with end-stage renal disease. In Finland, 191 children had received renal transplants by October 2007, and 42% of them have already reached adulthood. Improvements in the treatment of end-stage renal disease, surgical techniques, intensive care medicine, and immunosuppressive therapy have paved the way to the current highly successful outcomes of pediatric transplantation. In children, the transplanted graft should last for decades, and normal growth and development should be guaranteed. These objectives set considerable requirements on optimizing and fine-tuning the post-operative therapy. Careful optimization of immunosuppressive therapy is crucial in protecting the graft against rejection, but also in protecting the patient against adverse effects of the medication. In the present study, the results of a retrospective investigation into individualized dosing of immunosuppressive medication, based on pharmacokinetic profiles, therapeutic drug monitoring, graft function and histology studies, and glucocorticoid biological activity determinations, are reported. Subgroups of a total of 178 patients, who received renal transplants in 1988–2006, were included in the study. The mean age at TX was 6.5 years, and approximately 26% of the patients were <2 years of age. The most common diagnosis leading to renal TX was congenital nephrosis of the Finnish type (NPHS1). Pediatric patients in Finland receive standard triple immunosuppression consisting of cyclosporine A (CsA), methylprednisolone (MP) and azathioprine (AZA) after renal TX. Optimal dosing of these agents is important to prevent rejections and preserve graft function on the one hand, and to avoid the potentially serious adverse effects on the other. CsA has a narrow therapeutic window and individually variable pharmacokinetics; therapeutic monitoring of CsA is, therefore, mandatory. Traditionally, CsA monitoring has been based on pre-dose trough levels (C0), but recent pharmacokinetic and clinical studies have revealed that the immunosuppressive effect may be related to diurnal CsA exposure and to the blood CsA concentration 0-4 hours after dosing. The two-hour post-dose concentration (C2) has proved a reliable surrogate marker of CsA exposure. Individual starting doses of CsA were analyzed in 65 patients. A recommended dose based on a pre-TX pharmacokinetic study was calculated for each patient according to the pre-TX protocol. The predicted dose was clearly higher in the youngest children than in the older ones (22.9±10.4 and 10.5±5.1 mg/kg/d in patients <2 and >8 years of age, respectively). The actually administered oral doses of CsA were collected for three weeks after TX and compared with the pharmacokinetically predicted dose. After the TX, dosing of CsA was adjusted according to clinical parameters and the blood CsA trough concentration. The pharmacokinetically predicted dose and patient age were the two significant parameters explaining post-TX doses of CsA. Accordingly, young children received significantly higher oral doses of CsA than the older ones. The correlation with the actually administered doses after TX was best in those patients who had a predicted dose clearly higher or lower (> ±25%) than the average in their age group. Due to the great individual variation in pharmacokinetics, standardized dosing of CsA (based on body mass or surface area) may not be adequate.
Pre-TX profiles are helpful in determining suitable initial CsA doses. CsA monitoring based on trough and C2 concentrations was analyzed in 47 patients who received renal transplants in 2001–2006. C0 and C2 concentrations and acute rejections were recorded during the post-TX hospitalization, and also three months after TX when the first protocol core biopsy was obtained. The patients who remained rejection free had slightly higher C2 concentrations, especially very early after TX. However, after the first two weeks the trough level was also higher in the rejection-free patients than in those with acute rejections. Three months after TX, the trough level was higher in patients with normal histology than in those with rejection changes in the routine biopsy. Monitoring of both the trough level and C2 may thus be warranted to guarantee a sufficient peak concentration and baseline immunosuppression on the one hand, and to avoid over-exposure on the other. Controlling rejection in the early months after transplantation is crucial, as it may contribute to the development of long-term allograft nephropathy. Recently, it has become evident that immunoactivation fulfilling the histological criteria of acute rejection is possible in a well-functioning graft with no clinical signs or laboratory perturbations. The influence of treating subclinical rejection, diagnosed in the 3-month protocol biopsy, on graft function and histology 18 months after TX was analyzed in 22 patients and compared with 35 historical control patients. The incidence of subclinical rejection at three months was 43%, and the patients received a standard rejection treatment (a course of increased MP) and/or increased baseline immunosuppression, depending on the severity of rejection and graft function. The glomerular filtration rate (GFR) at 18 months was significantly better in the patients who were screened and treated for subclinical rejection than in the historical patients (86.7±22.5 vs. 67.9±31.9 ml/min/1.73 m2, respectively). The improvement was most remarkable in the youngest (<2 years) age group (94.1±11.0 vs. 67.9±26.8 ml/min/1.73 m2). Histological findings of chronic allograft nephropathy were also more common in the historical patients in the 18-month protocol biopsy. All pediatric renal TX patients receive MP as part of the baseline immunosuppression. Although the maintenance dose of MP is very low in the majority of patients, the well-known steroid-related adverse effects are not uncommon. A previous study in Finnish pediatric TX patients showed that steroid exposure, measured as the area under the concentration-time curve (AUC), rather than the dose, correlates with the adverse effects. In the present study, the MP AUC was measured in sixteen stable maintenance patients, and a correlation with excess weight gain during the 12 months after TX, as well as with height deficit, was found. A novel bioassay measuring the activation of the glucocorticoid receptor-dependent transcription cascade was also employed to assess the biological effect of MP. Glucocorticoid bioactivity was found to be related to the adverse effects, although the relationship was not as apparent as that with the serum MP concentration. The findings in this study support individualized monitoring and adjustment of immunosuppression based on pharmacokinetics, graft function and histology. Pharmacokinetic profiles are helpful in estimating drug exposure and thus identifying the patients who might be at risk of excessive or insufficient immunosuppression.
Individualized doses and monitoring of blood concentrations should definitely be employed with CsA, but possibly also with steroids. As an alternative to complete steroid withdrawal, individualized dosing based on drug exposure monitoring might help in avoiding the adverse effects. Early screening and treatment of subclinical immunoactivation is beneficial as it improves the prospects of good long-term graft function.
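The steroid-exposure measure referred to above, the area under the concentration-time curve (AUC), is commonly estimated from timed blood samples with the trapezoidal rule. A minimal sketch with hypothetical sampling times and concentrations (not data from the study) is shown below.

```python
# Minimal sketch of estimating drug exposure as the area under the
# concentration-time curve (AUC) with the linear trapezoidal rule.
# Sampling times and concentrations are hypothetical, not study data.
def auc_trapezoid(times_h, concentrations):
    """AUC over the sampled interval (concentration unit x h)."""
    auc = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        auc += dt * (concentrations[i] + concentrations[i - 1]) / 2.0
    return auc

# e.g. blood samples drawn 0, 1, 2, and 4 hours after an oral dose (µg/l)
print(auc_trapezoid([0, 1, 2, 4], [10.0, 450.0, 380.0, 150.0]))  # -> AUC in µg*h/l
```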
Abstract:
Healthcare-associated infections (HAIs) are known to increase the risk of patient morbidity and mortality in different healthcare settings and thereby to cause additional costs. HAIs typically affect patients with severe underlying conditions. HAIs are also prevalent among pediatric patients, but the distribution of the types of infection and the causative agents differ from those detected in adults. The aim of this study was to obtain information on pediatric HAIs in Finland through an assessment of the surveillance of bloodstream infections (BSIs), through two outbreak investigations in a neonatal intensive care unit (NICU), and through a study of postoperative HAIs after open-heart surgery. The studies were carried out at the Hospital for Children and Adolescents of Helsinki University Central Hospital. Epidemiological features of pediatric BSIs were assessed. For the outbreak investigations, case definitions were set and data were collected from microbiological and clinical records. The antimicrobial susceptibilities of the Serratia marcescens and Candida parapsilosis isolates were determined, and the isolates were genotyped. Patient charts were reviewed for the case-control and cohort studies during the outbreak investigations, as well as for the patients who acquired surgical site infections (SSIs) after having undergone open-heart surgery. A prospective postdischarge study was also conducted to detect postoperative HAIs in these patients. During 1999-2006, the overall annual BSI rate was 1.6/1,000 patient days (range by year, 1.2–2.1). High rates (average, 4.9 and 3.2 BSIs/1,000 patient days) were detected in the hematology and neonatology units. Coagulase-negative staphylococci were the most common pathogens both hospital-wide and in each patient group. The overall mortality was 5%. Genotyping of the 15 S. marcescens isolates revealed three independent clusters, whereas all of the 26 C. parapsilosis isolates studied proved to be indistinguishable. The NICU was overcrowded during the S. marcescens clusters. A negative correlation between C. parapsilosis BSIs and fluconazole use in the NICU was detected, and the isolates, derived from a single initially susceptible strain, became less susceptible to fluconazole over time. Eighty postoperative HAIs, including all severe infections, were detected during hospitalization after open-heart surgery; 34% of these HAIs were SSIs and 25% were BSIs. The postdischarge study found 65 infections that were likely to be associated with hospitalization. The majority (89%) of them were viral respiratory or gastrointestinal infections, and these often led to rehospitalization. The annual hospital-wide BSI rates were stable, and the significant variation detected in some units could not be seen in the overall rates. Further studies with data adequately adjusted for risk factors are needed to assess BSI rates in the patient groups with the highest rates (hematology, neonatology). The outbreak investigations showed that horizontal transmission was common in the NICU; overcrowding and lapses in hand hygiene probably contributed to the spread of the pathogens. Following long-term use of fluconazole in the NICU, resistance to fluconazole developed in C. parapsilosis. Almost one-fourth of the patients who underwent open-heart surgery acquired at least one HAI. All severe HAIs were detected during hospitalization. The postdischarge study found numerous viral infections, which often caused rehospitalization.
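The surveillance rates quoted above are incidence densities, i.e. the number of BSIs divided by the accumulated patient days and scaled to 1,000 patient days. A minimal sketch with hypothetical counts (not the study's surveillance data) is shown below.

```python
# Minimal sketch of the BSI incidence-density calculation referred to above
# (infections per 1,000 patient days); the counts are hypothetical.
def bsi_rate_per_1000_patient_days(n_bsi: int, patient_days: int) -> float:
    return 1000.0 * n_bsi / patient_days

print(bsi_rate_per_1000_patient_days(48, 30000))  # -> 1.6
```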
Abstract:
The spread of multidrug-resistant (MDR) bacteria has reached a threatening level. Extended-spectrum beta-lactamase-producing Enterobacteriaceae (ESBLE) are now endemic in many hospitals worldwide as well as in the community, while resistance rates continue to rise steadily in Acinetobacter baumannii and Pseudomonas aeruginosa [1]. Even more alarming is the dissemination of carbapenemase-producing Enterobacteriaceae (CPE), causing therapeutic and organizational problems in hospitals facing outbreaks or endemicity. This context could elicit serious concerns for the coming two decades; nevertheless, effective measures exist to stop the amplification of the problem, and several axes of prevention remain to be fully exploited, leaving room for realistic hopes, at least for many parts of the world...
Abstract:
Introduction: Patients after sepsis syndromes have a poor quality of life and a high rate of recurring illness or mortality. Follow-up clinics have been instituted for patients after general intensive care, but the evidence is sparse, and there has been no clinic specifically for survivors of sepsis. The aim of this trial is to investigate whether targeted screening and appropriate intervention for these patients can result in an improved quality of life (Short Form 36 health survey (SF36 V.2)), decreased mortality in the first 12 months, decreased readmission to hospital and/or decreased use of health resources. Methods and analysis: 204 patients after sepsis syndromes will be randomised to one of two groups. The intervention group will attend an outpatient clinic every two months for 6 months and receive screening and targeted intervention. The usual care group will remain under the care of their physician. To analyse the results, a baseline comparison will be carried out between the two groups. Generalised estimating equations will compare the SF36 domain scores between groups and across time points. Mortality will be compared between groups using a Cox proportional hazards (time until death) analysis. Time to first readmission will be compared between groups by a survival analysis. Healthcare costs will be compared between groups using a generalised linear model. The economic (health resource) evaluation will be a within-trial incremental cost-utility analysis with a societal perspective. Ethics and dissemination: Ethical approval has been granted by the Royal Brisbane and Women's Hospital Human Research Ethics Committee (HREC; HREC/13/QRBW/17), The University of Queensland HREC (2013000543), Griffith University (RHS/08/14/HREC) and the Australian Government Department of Health (26/2013). The results of this study will be submitted to peer-reviewed intensive care journals and presented at national and international intensive care and/or rehabilitation conferences.
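The mortality comparison described in the analysis plan is a Cox proportional hazards model of time until death by treatment group. A minimal sketch is given below; it assumes the Python lifelines package and uses hypothetical column names and data, and it is not the trial's actual analysis code.

```python
# Minimal sketch of a Cox proportional hazards comparison of time to death between
# the intervention (follow-up clinic) and usual-care groups, as outlined in the
# analysis plan. Assumes the lifelines package; data and column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "months_followed": [12, 12, 3, 9, 12, 6, 12, 10],   # time to death or censoring
    "died":            [0,  0,  1, 1, 0,  1, 0,  1],    # 1 = death observed
    "intervention":    [1,  1,  1, 0, 0,  0, 1,  0],    # 1 = follow-up clinic group
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_followed", event_col="died")
cph.print_summary()  # hazard ratio for the intervention group vs usual care
```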
Abstract:
The coagulation system of newborn infants differs markedly from that of older children and adults. The activities of most coagulation factors and anticoagulants are low, leading to altered regulation of the formation of the key enzyme, thrombin. Timely and adequate generation of thrombin is essential, as thrombin activates platelets and many coagulation factors, cleaves fibrinogen into fibrin, and activates the antithrombotic and anti-inflammatory protein C pathway. On the other hand, excess thrombin may promote thrombotic complications and exacerbate harmful inflammatory reactions. Despite these characteristic features, the newborn coagulation system can be considered physiological, since healthy newborns rarely show haemorrhagic or thrombotic complications. Sick newborns, however, often encounter clinical situations that challenge their coagulation system. The aim of this study was to clarify the behaviour of the neonatal coagulation system in selected clinical situations, with special emphasis on the generation of thrombin. Thrombin was measured by in vivo thrombin generation markers and by thrombin generation potential in vitro. The patient groups included sick newborns undergoing intensive care and receiving fresh-frozen plasma (FFP), requiring exchange transfusions (ET), or presenting with a congenital heart defect requiring open heart surgery. Additionally, healthy newborns with an inherited heterozygous factor V Leiden (FVL) mutation were studied. Thrombin generation potential was also analysed in cord plasma of healthy infants and in adults. Healthy as well as sick newborn infants showed a lower total thrombin generation potential in vitro, but faster initiation of thrombin generation, than adults. These findings were qualitatively similar when plasma was supplemented with platelets. Platelets, however, significantly altered the effect of the major anticoagulant, activated protein C (APC), on thrombin generation potential. In accordance with previous studies, thrombin generation in healthy newborn platelet-poor plasma was resistant to the anticoagulant effects of APC, but when the plasma was supplemented with platelets, APC attenuated thrombin generation significantly more in newborns than in adults. In vivo generation of thrombin was elevated in nearly all of the sick newborn infants. The low-volume FFP transfusion and the change from neonatal to adult blood in ET exerted markedly different effects on neonatal thrombin generation. FFP reduced the in vivo generation of thrombin in those newborns with the highest pretransfusional thrombin generation, thus acting as an anticoagulant agent; in those infants with lower pretransfusional thrombin generation, the effect of FFP on thrombin generation was fairly neutral. On the other hand, the combination of red blood cells and FFP used to perform ET significantly increased in vivo thrombin formation and shifted the balance in the newborn coagulation system in the procoagulant direction. Cardiopulmonary bypass (CPB) also significantly increased in vivo thrombin generation, but the thrombin generation profile during CPB differed from that previously observed in adults. The escalation of thrombin at early reperfusion, whose occurrence in adults is associated with postoperative myocardial damage, was not observed in newborns. Finally, in healthy newborns with FVL heterozygosity, faster initiation of thrombin generation was observed compared with controls.
Interestingly, the FV level was lower in FVL-heterozygous infants, possibly to counteract the procoagulant effects induced by FVL. In conclusion, unique features of thrombin regulation in newborn infants were observed, including a novel platelet effect on the regulation of the protein C pathway. The clinical challenges mainly seemed to shift the balance in the coagulation system of newborns in the procoagulant direction. Blood component transfusions markedly affected coagulation in a manner specific to the product, but one that could also be altered by the clinical situation. Overall, the results highlight the need to understand developmental haemostasis for both diagnostic and therapeutic purposes.
Abstract:
Liver transplantation is an established therapy for both acute and chronic liver failure. Despite excellent long-term outcomes, graft dysfunction remains a problem affecting up to 15-30% of recipients. The etiology of dysfunction is multifactorial, with ischemia-reperfusion injury regarded as one of the most important contributors. This thesis focuses on the inflammatory response during graft procurement and reperfusion in adult liver transplantation. Activation of protein C was examined as a potential endogenous anti-inflammatory mechanism, and the effects of inflammatory responses on graft function and outcome were investigated. Seventy adult patients undergoing liver transplantation in Helsinki University Central Hospital, and 50 multiorgan donors, were studied. Blood samples from the portal and hepatic veins were drawn before graft procurement and at several time points during graft reperfusion to assess changes within the liver. Liver biopsies were taken before graft preservation and after reperfusion. Neutrophil and monocyte CD11b and L-selectin expression were analysed by flow cytometry. Plasma TNF-α, IL-6, IL-8, sICAM-1, and HMGB1 were determined by ELISA and Western blotting. HMGB1 immunohistochemistry was performed on liver tissue specimens. Plasma protein C and activated protein C were determined by an enzyme-capture assay. Hepatic IL-8 release during graft procurement was associated with subsequent graft dysfunction, biliary dysfunction in particular, in the recipient. Biliary marker levels increased only 5–7 days after transplantation; thus, the donor inflammatory response appears to influence recipient liver function with relatively long-lasting effects. Hepatic phagocyte activation and sequestration, with concomitant HMGB1 release, occurred during reperfusion. Neither phagocyte activation nor plasma cytokines correlated with postoperative graft function. Thus, activation of the inflammatory responses within the liver during reperfusion may be of minor clinical significance. However, HMGB1 was released from hepatocytes and also correlated with postoperative transaminase levels. Accordingly, HMGB1 appears to be a marker of hepatocellular injury.
Abstract:
This thesis developed a model of the factors that influence meeting the needs of families with a relative admitted to an adult intensive care unit. The results from the model indicate that several variables are significant in meeting the needs of families in the ICU. The factors identified in this study should be considered when planning future intervention studies or implementing interventions in ICU clinical practice. Meeting the needs of families is an integral part of caring for a critically ill patient, and ICU staff can minimise this stressful time for relatives by anticipating and addressing family needs.
Abstract:
Sepsis is the leading cause of death in intensive care units and results from a deleterious systemic host response to infection. Although initially perceived as potentially deleterious, catalytic antibodies have been proposed to participate in the removal of metabolic wastes and in protection against infection. Here we show that the presence in plasma of IgG endowed with serine protease-like hydrolytic activity strongly correlates with survival from sepsis. Variances of the catalytic rates of IgG were greater in patients with severe sepsis than in healthy donors (P < 0.001), indicating that sepsis is associated with alterations in plasma levels of hydrolytic IgG. The catalytic rates of IgG from patients who survived were significantly greater than those of IgG from deceased patients (P < 0.05). The cumulative rate of survival was higher among patients exhibiting high rates of IgG-mediated hydrolysis than among patients with low hydrolytic rates (P < 0.05). An inverse correlation was also observed between markers of the severity of disseminated intravascular coagulation and the hydrolysis rates of patients' IgG. Furthermore, IgG from three surviving patients hydrolyzed factor VIII, and IgG from one of these patients also hydrolyzed factor IX, suggesting that, in some patients, catalytic IgG may participate in the control of disseminated microvascular thrombosis. Our observations provide the first evidence that hydrolytic antibodies might play a role in recovery from disease.