865 results for Central venous catheter-associated bloodstream infections
Abstract:
OBJECTIVE: Multiple organ failure is a common complication of acute circulatory and respiratory failure. We hypothesized that therapeutic interventions used routinely in intensive care can interfere with the perfusion of the gut and the liver, and thereby increase the risk of mismatch between oxygen supply and demand. DESIGN: Prospective, observational study. SETTING: Interdisciplinary intensive care unit (ICU) of a university hospital. PATIENTS: Thirty-six patients on mechanical ventilation with acute respiratory or circulatory failure or severe infection were included. INTERVENTIONS: Insertion of a hepatic venous catheter. MEASUREMENTS AND MAIN RESULTS: Daily nursing procedures were recorded. A decrease of ≥5% in hepatic venous oxygen saturation (Sho2) was considered relevant. Observation time was 64 (29-104) hours (median [interquartile range]). The ICU stay was 11 (8-15) days, and hospital mortality was 35%. The number of periods with procedures/patient was 170 (98-268), the number of procedure-related decreases in Sho2 was 29 (13-41), and the number of decreases in Sho2 unrelated to procedures was 9 (4-19). Accordingly, procedure-related Sho2 decreases occurred 11 (7-17) times per day. Median Sho2 decrease during the procedures was 7 (5-10)%, and median increase in the gradient between mixed and hepatic venous oxygen saturation was 6 (4-9)%. Procedures that caused most Sho2 decreases were airway suctioning, assessment of level of sedation, and changing patients' position. Sho2 decreases were associated with small but significant increases in heart rate and intravascular pressures. Maximal Sequential Organ Failure Assessment scores in the ICU correlated with the number of Sho2 decreases (r = 0.56; p < 0.001) and with the number of procedure-related Sho2 decreases (r = 0.60; p < 0.001). CONCLUSIONS: Patients are exposed to repeated episodes of impaired splanchnic perfusion during routine nursing procedures.
More research is needed to examine the correlation, if any, between nursing procedures and hepatic venous desaturation.
Abstract:
OBJECTIVE This study tested the hypotheses that intermittent coronary sinus occlusion (iCSO) reduces myocardial ischaemia, and that the amount of ischaemia reduction is related to coronary collateral function. DESIGN Prospective case-control study with intraindividual comparison of myocardial ischaemia during two 2-min coronary artery balloon occlusions with and without simultaneous iCSO by a balloon-tipped catheter. SETTING University Hospital. PATIENTS 35 patients with chronic stable coronary artery disease. INTERVENTION 2-min iCSO. MAIN OUTCOME MEASURES Myocardial ischaemia as assessed by intracoronary (i.c.) ECG ST shift at 2 min of coronary artery balloon occlusion. Collateral flow index (CFI) without iCSO, that is, the ratio between mean distal coronary occlusive (Poccl) and mean aortic pressure (Pao), both minus central venous pressure. RESULTS I.c. ECG ST segment shift (elevation in all) at the end of the procedure with iCSO versus without iCSO was 1.33±1.25 mV versus 1.85±1.45 mV, p<0.0001. Regression analysis showed that the degree of i.c. ECG ST shift reduction during iCSO was related to CFI, best fitting a Lorentzian function (r² = 0.61). Ischaemia reduction with iCSO was greatest at a CFI of 0.05-0.20, whereas in the low and high CFI range the effect of iCSO was absent. CONCLUSIONS ICSO reduces myocardial ischaemia in patients with chronic coronary artery disease. Ischaemia reduction by iCSO depends on coronary collateral function. A minimal degree of collateral function is necessary to render iCSO effective. ICSO cannot manifest an effect when collateral function prevents ischaemia in the first place.
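As a minimal sketch, the pressure-derived collateral flow index defined in the abstract above, the ratio of mean distal coronary occlusive pressure to mean aortic pressure, both referenced to central venous pressure, can be computed as follows. The numeric values in the example are illustrative only, not data from the study.

```python
def collateral_flow_index(p_occl, p_ao, cvp):
    """Pressure-derived collateral flow index (CFI).

    p_occl : mean distal coronary occlusive pressure (mmHg)
    p_ao   : mean aortic pressure (mmHg)
    cvp    : central venous pressure (mmHg)
    """
    return (p_occl - cvp) / (p_ao - cvp)

# Illustrative values (mmHg); a CFI >= 0.25 marked the better-surviving
# group in the cohort study quoted further below.
cfi = collateral_flow_index(p_occl=25, p_ao=95, cvp=5)  # (25-5)/(95-5) ≈ 0.22
```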
Abstract:
AIMS The aim of our study in patients with coronary artery disease (CAD) and present, or absent, myocardial ischaemia during coronary occlusion was to test whether (i) left ventricular (LV) filling pressure is influenced by the collateral circulation and, on the other hand, whether (ii) its resistance to flow is directly associated with LV filling pressure. METHODS AND RESULTS In 50 patients with CAD, the following parameters were obtained before and during a 60 s balloon occlusion: LV, aortic (Pao) and coronary pressure (Poccl), flow velocity (Voccl), central venous pressure (CVP), and coronary flow velocity after coronary angioplasty (V(Ø-occl)). The following variables were determined and analysed at 10 s intervals during occlusion, and at 60 s of occlusion: LV end-diastolic pressure (LVEDP), velocity-derived (CFIv) and pressure-derived collateral flow index (CFIp), coronary collateral (Rcoll), and peripheral resistance index to flow (Rperiph). Patients with ECG signs of ischaemia during coronary occlusion (insufficient collaterals, n = 33) had higher values of LVEDP over the entire course of occlusion than those without ECG signs of ischaemia during occlusion (sufficient collaterals, n = 17). Despite no ischaemia in the latter, there was an increase in LVEDP from 20 to 60 s of occlusion. In patients with insufficient collaterals, CFIv decreased and CFIp increased during occlusion. Beyond an occlusive LVEDP > 27 mmHg, Rcoll and Rperiph increased as a function of LVEDP. CONCLUSION Recruitable collaterals are reciprocally tied to LV filling pressure during occlusion. If poorly developed, they affect it via myocardial ischaemia; if well developed, LV filling pressure still increases gradually during occlusion despite the absence of ischaemia, indicating transmission of collateral perfusion pressure to the LV. With low, but not high, collateral flow, resistance to collateral as well as coronary peripheral flow is related to LV filling pressure in the high range.
Abstract:
OBJECTIVE To expand the limited information on the prognostic impact of quantitatively obtained collateral function in patients with coronary artery disease (CAD) and to estimate causality of such a relation. DESIGN Prospective cohort study with long-term observation of clinical outcome. SETTING University Hospital. PATIENTS One thousand one hundred and eighty-one patients with chronic stable CAD undergoing 1771 quantitative, coronary pressure-derived collateral flow index measurements, as obtained during a 1-min coronary balloon occlusion (CFI is the ratio between mean distal coronary occlusive pressure and mean aortic pressure both subtracted by central venous pressure). Subgroup of 152 patients included in randomised trials on the longitudinal effect of different arteriogenic protocols on CFI. INTERVENTIONS Collection of long-term follow-up information on clinical outcome. MAIN OUTCOME MEASURES All-cause mortality and major adverse cardiac events. RESULTS Cumulative 15-year survival rate was 48% in patients with CFI<0.25 and 65% in the group with CFI≥0.25 (p=0.0057). Cumulative 10-year survival rate was 75% in patients without arteriogenic therapy and 88% (p=0.0482) in the group with arteriogenic therapy and showing a significant increase in CFI at follow-up. By proportional hazard analysis, the following variables predicted increased all-cause mortality: age, low CFI, left ventricular end-diastolic pressure and number of vessels with CAD. CONCLUSIONS A well-functioning coronary collateral circulation independently predicts lowered mortality in patients with chronic CAD. This relation appears to be causal, because augmented collateral function by arteriogenic therapy is associated with prolonged survival.
Abstract:
The goal of this publication is to attempt to assess the thirteen years (2001–2014) of the West’s military presence in the countries of post-Soviet Central Asia, closely associated with the ISAF and OEF-A (Operation Enduring Freedom – Afghanistan) missions in Afghanistan. There will also be an analysis of the actual challenges for the region’s stability after 2014. The current and future security architecture in Central Asia will also be looked at closely, as will the actual capabilities to counteract the most serious threats within its framework. The need to separately handle the security system in Central Asia and security as such is dictated by the particularities of the political situation in the region, the key mechanism of which is geopolitics understood as global superpower rivalry for influence, with a secondary or even instrumental role for the five regional states, while ignoring their internal problems. Such an approach is especially present in Russia’s perception of Central Asia, as it views security issues in geopolitical categories. Because of this, security analysis in the Central Asian region requires a broader geopolitical context, which was taken into account in this publication. The first part investigates the impact of the Western (primarily US) military and political presence on the region’s geopolitical architecture between 2001 and 2014. The second chapter is an attempt to take an objective look at the real challenges to regional security after the withdrawal of the coalition forces from Afghanistan, while the third chapter is dedicated to analysing the probable course of events in the security dimension following 2014.
The time horizon of the predictions included in this publication does not exceed three to five years, owing to the dynamic developments in Central Asia and its immediate vicinity (the former Soviet Union, Afghanistan, Pakistan, Iran), and to the large degree of unpredictability of the policies of one of the key regional actors, Russia (both in terms of its activity in the international arena and its internal developments).
Abstract:
Healthy replacement heifers are one of the foundations of a healthy dairy herd. Farm management and rearing systems in Switzerland provide a wide variety of factors that could potentially be associated with intramammary infections (IMI) in early lactating dairy heifers. In this study, IMI with minor mastitis pathogens such as coagulase-negative staphylococci (CNS), contagious pathogens, and environmental major pathogens were identified. Fifty-four dairy farms were enrolled in the study. A questionnaire was used to collect herd-level data on housing, management and welfare of young stock during farm visits and interviews with the farmers. Cow-level data such as breed, age at first calving, udder condition and swelling, and calving ease were also recorded. Data were also collected about young stock that spent a period of at least 3 months on an external rearing farm or on a seasonal alpine farm. At the quarter level, teat conditions such as teat lesions, teat dysfunction, presence of a papilloma and teat length were recorded. Within 24 h after parturition, samples of colostral milk from 1564 quarters (391 heifers) were collected aseptically for bacterial culture. Positive bacteriological culture results were found in 49% of quarter samples. Potential risk factors for IMI were identified at the quarter, animal and herd level using multivariable and multilevel logistic regression analysis. At the herd level, tie-stalls, and at the cow level, the breed category “Brown cattle”, were risk factors for IMI caused by contagious major pathogens such as Staphylococcus aureus (S. aureus). At the quarter level, teat swelling and teat lesions were highly associated with IMI caused by environmental major pathogens. At the herd level, heifer rearing at external farms was associated with less IMI caused by major environmental pathogens. Keeping pregnant heifers in a separate group was negatively associated with IMI caused by CNS. The odds of IMI with coagulase-negative staphylococci increased if weaning age was less than 4 months and if concentrates were fed to calves younger than 2 weeks. This study identified herd-, cow- and quarter-level risk factors that may be important for IMI prevention in the future.
Abstract:
We undertook a clinical trial to compare the efficacy of 2% (w/v) chlorhexidine gluconate in 70% (v/v) isopropyl alcohol with the efficacy of 70% (v/v) isopropyl alcohol alone for skin disinfection to prevent peripheral venous catheter colonization and contamination. We found that the addition of 2% chlorhexidine gluconate reduced the number of peripheral venous catheters that were colonized or contaminated. © 2008 by The Society for Healthcare Epidemiology of America. All rights reserved.
Abstract:
Background: Persons in acute care settings who have indwelling urethral catheters are at higher risk of acquiring a urinary tract infection (UTI). Other complications related to prolonged indwelling urinary catheters include decreased mobility, damage to the meatus and/or urethra, increased use of antibiotics, increased length of stay, and pain. UTIs in acute care settings account for 30 to 40% of all health care-associated infections (HAIs). Of these, 80% are catheter-associated UTIs (CAUTIs). Purpose: To utilize the CDC (2009) bundle approach for CAUTI prevention and create a program which supports a multimodal method of improving urinary catheter use, maintenance, and removal, including a continuing competency program where role expansion is anticipated. Methods: A comprehensive review of the literature was conducted. Physicians were consulted through a PowerPoint presentation followed by a letter explaining the project, a questionnaire, and two selections of relevant literature. Nursing staff and allied health professionals from the target units of 3A and 3B medicine attended one of two lunch-and-learns, where they were presented the project via a PowerPoint presentation and given the same questionnaire as distributed to physicians. Results: Five e-learning modules, a revised policy, and a clinical pathway have been developed to support staff with best-practice knowledge transfer. Conclusion: Behaviour changes need to be approached with a framework, extensive consultation, and education. Sustainability of any practice change cannot occur without having completed the background work to ensure staff have access to tools to support the change.
Abstract:
Advances in neonatology have reduced mortality and consequently increased the survival of preterm newborns. On the other hand, there has also been a considerable increase in the risk of developing healthcare-associated infections (HAIs) in their most invasive form, especially bloodstream infections. This situation is worrying; preventing such infections is a challenge and has become one of the priorities in the Neonatal Intensive Care Unit (NICU). Sepsis is the main cause of death in critically ill neonates and affects more than one million newborns each year, representing 40% of all neonatal deaths. The incidence of late-onset sepsis can reach 50% in NICUs. Currently, the main agent responsible for sepsis in developed countries is coagulase-negative Staphylococcus (CoNS), followed by S. aureus. Cases of HAIs caused by isolates resistant to the major classes of antimicrobial agents have become increasingly frequent in the NICU. Vancomycin therefore has to be prescribed more often, and is today the first option in the treatment of bloodstream infections caused by resistant Staphylococcus. The objectives of this study were to assess the impact of a change in the antimicrobial-use protocol on the epidemiology of late-onset sepsis in a level III NICU, to determine the frequency of multiresistant microorganisms, and to assess the number of neonates who died. The study was conducted in the level III NICU of HC-UFU. Three study groups were formed based on the use of the proposed late-onset sepsis treatment protocol, with 216 neonates belonging to period A, 207 to period B, and 209 to period C. The work was divided into three stages. Period A: data were collected from neonates admitted to the unit between September 2010 and August 2011, when late-onset sepsis was treated with oxacillin and gentamicin, oxacillin and amikacin, or oxacillin and cefotaxime. Period B: data were collected from March 2012 to February 2013, starting six months after the protocol change; owing to the higher prevalence of CoNS, the initial protocol was changed to vancomycin and cefotaxime. Period C: data were collected from newborns admitted to the unit from September 2013 to August 2014, starting six months after the protocol change, which occurred in March 2013. Of the 632 neonates included in this study, 511 (80.8%) came from the gynecology and obstetrics department of the HC-UFU. The mean gestational age was 33 weeks, and the prevailing sex was male (55.7%). Seventy-nine percent of the studied neonates were hospitalized at the level III NICU of HC-UFU because of complications related to the respiratory system. Suspicion of sepsis led to admission to the unit for 1.9% of newborns. Overall, the infection rate was 34.5%, and the most frequent infectious syndrome was sepsis (81.2%). There was a tendency towards a reduction in the number of neonates who died between periods A and C (p = 0.053). Of the 176 cases of late-onset sepsis, 73 were clinical sepsis and 103 had laboratory confirmation, with a predominance of Gram-positive bacteria, which accounted for 67.2% of the isolates, CoNS being the most frequent microorganism (91.5%). There was a statistically significant reduction in the isolation of Gram-positive microorganisms between periods A and C (p = 0.0365), as well as a reduction in multidrug-resistant CoNS (periods A and B, p = 0.0462; periods A and C, p = 0.158). This study concluded that CoNS was the main microorganism responsible for late-onset sepsis in neonates in the NICU of HC-UFU; that the main risk factors for late-onset sepsis were birth weight <1500 g, use of a PICC and umbilical venous catheter, need for mechanical ventilation and parenteral nutrition, SNAPPE > 24, and length of stay of more than seven days; that the new empirical treatment protocol for late-onset sepsis, based on the use of vancomycin combined with cefepime, was effective, since it promoted a reduction in the isolation of CoNS from blood cultures between the pre- and post-implementation periods (A and C, respectively); and that there was likewise a reduction in the number of newborns who died between periods A and C.
Abstract:
Alternations between siliciclastic, carbonate and evaporitic sedimentary systems, as recorded in the Aptian mixed succession of southern Tunisia, reflect profound palaeoceanographic and palaeoclimatic changes in this area of the southern Tethyan margin. The evolution from Urgonian-type carbonates (Berrani Formation, lower Aptian) at the base of the series, to intervals dominated by gypsum or detrital deposits in the remainder of the Aptian, is thought to result from the interplay between climate change and the tectonic activity that affected North Africa. Based on the evolution of clay mineral assemblages, the early Aptian is interpreted as having been dominated by slightly humid conditions, since smectitic minerals are observed. Near the early to late Aptian boundary, the onset of gypsiferous sedimentation is associated with the appearance of palygorskite and sepiolite, which supports the installation of arid conditions in this area of the southern Tethyan margin. The evaporitic sedimentation may also have been promoted by the peculiar tectonic setting of the Bir Oum Ali area during the Aptian, where local subsidence may have been tectonically enhanced, linked to the opening of the northern and central Atlantic. Stress associated with the west and central African rift systems may have triggered the development of NW-SE half-graben structures. Uplifted areas may have constituted potential new sources for the clastic material that was subsequently deposited during the late Aptian. Chemostratigraphic (δ13C) correlation of the Bir Oum Ali succession with other peri-Tethyan regions complements biostratigraphic findings, and indicates that a potential expression of the Oceanic Anoxic Event (OAE) 1a may be preserved in this area of Tunisia.
Although the characteristic negative spike at the base of this event is not recognized in the present study, a subsequent large positive excursion in δ13C values has an amplitude and absolute values similar to those reported from other peri-Tethyan regions, thus supporting the identification of isotopic segments C4-C7 of the OAE1a. The absence of the negative spike may be linked to either non-preservation or non-deposition: the OAE1a occurred in a global transgressive context, and since the Bir Oum Ali region was located in the innermost part of the southern Tethyan margin during most of the Aptian, stratigraphic hiatuses may have been longer than in other regions of the Tethys. This emphasizes the importance of integrating several stratigraphic disciplines (bio-, chemo- and sequence stratigraphy) when performing long-distance correlation.
Abstract:
Infection is an inevitable consequence of chronic urinary catheterisation, with associated problems of recurrent catheter encrustation and blockage experienced by approximately 50% of all long-term catheterised patients. In this work we have exploited, for the first time, the reported pathogen-induced elevation of urine pH as a trigger for ‘intelligent’ antimicrobial release from novel hydrogel drug delivery systems of 2-hydroxyethyl methacrylate and vinyl-functionalised nalidixic acid derivatives, developed as candidate infection-resistant urinary catheter coatings. Demonstrating up to 20-fold faster rates of drug release at pH 10, representing infected urine pH, than at pH 7, and achieving reductions of up to 96.5% in in vitro bacterial adherence, our paradigm of pH-responsive drug delivery, which requires no external manipulation, therefore represents a promising development towards the prevention of catheter-associated urinary tract infections (CAUTIs) in vivo.
Abstract:
Background: The quest for continuous improvement in the quality of provided care is the objective of nursing. However, the insertion and permanence of a peripheral venous catheter has been associated with complications, making a systematic evaluation of the performance of professionals and the management of health services important. Objective: To analyse the complications that caused removal of intravenous catheters. Methods: A prospective study with 64 patients of a health service in Portugal, from July to September 2015. Patients aged 18 years or over with a peripheral venous catheter were included. Descriptive analysis was performed using SPSS. Ethical requirements were met. Results: Two hundred and three (203) intravenous catheters in 64 patients, mostly elderly (95.3%), with a mean age of 80 years, were evaluated. The catheters remained inserted between one and 12 days (mean 2 days); 66% of the devices were removed because of complications such as removal by the patient (17.7%), obstruction (17.2%), infiltration (14.8%), phlebitis (9.4%) and fluid exiting the insertion site (6.4%). The prevalence of obstruction and infiltration per patient was 36% and 39%, respectively. Conclusions: Obstruction and infiltration were the most prevalent complications leading to the removal and reinsertion of a new peripheral venous catheter, with the possibility of increased pain, infection and hospital costs. Given the risk of compromising patient safety, and in order to contribute to the improvement of health care, we suggest including obstruction and infiltration among the indicators of quality of care, so as to systematically evaluate results and (re)plan and implement preventive measures.
Abstract:
Introduction: Urinary tract infection (UTI) is very frequent at the FCI-IC; around 60% of patients diagnosed with nosocomial UTI have resistant organisms. In 2010, the CLSI lowered the susceptibility breakpoints for Enterobacteriaceae and removed the need for screening and confirmation of extended-spectrum β-lactamases (ESBL). The present study aims to determine the epidemiological profile of antibiotic prescribing in patients with nosocomial UTI. Design: An analytical, cross-sectional observational study. Methods: Univariate, bivariate and multivariate analyses were performed. The bivariate and multivariate analyses were carried out to determine the measure of association, with carbapenem prescription as the dependent variable, evaluated by means of the chi-squared test. Results: 131 urine cultures were reviewed and 116 were included. The most frequent microbiological isolates were E. coli and K. pneumoniae; 43.4% of the isolates expressed ESBL, and 90% of the isolates were susceptible to cefepime. Most of the models obtained showed a strong association between an ESBL report in the antibiogram and the prescription of carbapenems as final therapy (OR 33.12, 95% CI 2.90 to 337.4). Conclusion: The epidemiology of nosocomial UTI at the FCI-IC does not differ from international references; there is no adherence to in-hospital management guidelines, and the report of the word ESBL in the antibiogram predicts carbapenem prescription by the physician who reads the urine culture.
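An odds ratio with a Wald confidence interval, as reported in the abstract above, can be reproduced from a 2×2 contingency table. The sketch below uses hypothetical counts for illustration, not data from the study.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% CI from a 2x2 table.

    a: exposed cases        b: exposed non-cases
    c: unexposed cases      d: unexposed non-cases
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) for the Wald interval
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts: ESBL report vs. carbapenem prescription
or_, lower, upper = odds_ratio_ci(a=30, b=10, c=5, d=50)
```

Wide intervals such as the one quoted in the abstract (2.90 to 337.4) typically reflect small cell counts, since the standard error grows as any cell approaches zero.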
Abstract:
In this issue, Burns et al. report an estimate of the economic loss to Auckland City Hospital from cases of healthcare-associated bloodstream infection. They show that patients with infection stay longer in hospital, which must impose an opportunity cost because beds are blocked. Harder-to-measure costs fall on patients, their families and non-acute health services. Patients also face some risk of dying from the infection.