370 results for nosocomial
Abstract:
Nosocomial pneumonia is a frequent infection in hospitalized patients, accounting for 40% of nosocomial infections. Its rising incidence due to multidrug-resistant microorganisms leads to more inappropriate empirical antibiotic treatment, which in turn is associated with an increased risk of mortality. Knowing the microorganisms most frequently responsible for these infections in each hospital, and their antimicrobial susceptibility patterns, is important to reduce the incidence of inappropriate antibiotic treatment. Materials and methods: an observational, descriptive, longitudinal study of retrospective events was conducted using documentary sources (clinical records) of patients discharged with nosocomial pneumonia from the wards of the Department of Internal Medicine of the Hospital Nacional Rosales (HNR), from January to December 2013. Patients transferred from another hospital, patients who started empirical antibiotic treatment in the intensive care unit, and patients with an associated second nosocomial infection were excluded. Results: 124 patients were included, with a mean age of 57.91 years (standard deviation ± 20.46); 62.1% were men and 37.9% women, a male/female ratio of 1.63:1. The most frequent initial empirical antibiotic was monotherapy with ceftazidime. Bacteriological culture of sputum or bronchial secretions with susceptibility results was reported in only 30 cases (24.2%), and cultures were taken before starting antibiotics in only 11 (8.9%). The most frequent agents were Pseudomonas aeruginosa (susceptible to ciprofloxacin, imipenem, gentamicin and linezolid) and Acinetobacter baumannii (mostly multidrug-resistant). The reported mortality was 64.52%. Conclusion: at the HNR, initial empirical antibiotic therapy diverged from international guidelines, and the proportion of reported cultures was low.
Abstract:
Background: Nosocomial sepsis (NS) in newborns (NBs) is associated with high mortality rates and low microbial recovery rates. To overcome the latter problem, new molecular biology techniques are being used. Objectives: To evaluate the diagnostic efficacy of the SeptiFast test for the diagnosis of nosocomial sepsis in the newborn. Materials and Methods: 86 blood specimens from NBs with suspected NS (NOSEP-1 Test > 8 points) were analyzed using LightCycler SeptiFast (LC-SF), a real-time multiplex PCR system. The results were analyzed with the Roche SeptiFast Identification Software. Another blood sample was collected to carry out a blood culture (BC). Results: LC-SF showed a sensitivity (Sn) of 0.69 and a specificity (Sp) of 0.65 compared with blood culture. The kappa index of concordance between LC-SF and BC was 0.21. Thirteen (15.11%) samples were BC positive and 34 (31.39%) were positive with LC-SF. Conclusions: Compared with BC, LC-SF allows the detection of a greater number of pathogenic species in a small blood sample (1 mL) with a shorter response time.
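As a minimal sketch of how the reported sensitivity, specificity and kappa relate to a 2x2 cross-tabulation of LC-SF against blood culture, the Python snippet below uses cell counts that are a reconstruction roughly consistent with the figures above; the study's actual cross-tabulation is not given in the abstract.

```python
# Hypothetical cell counts, chosen to be roughly consistent with the abstract
# (13 BC-positive and 34 LC-SF-positive samples out of 86); not the study's data.

def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and Cohen's kappa for an index test vs a reference."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    observed = (tp + tn) / n
    # Chance agreement: products of the two tests' positive and negative marginals
    expected = ((tp + fp) / n) * ((tp + fn) / n) + ((fn + tn) / n) * ((fp + tn) / n)
    kappa = (observed - expected) / (1 - expected)
    return sensitivity, specificity, kappa

sn, sp, kappa = diagnostic_metrics(tp=9, fp=25, fn=4, tn=48)
print(f"Sn={sn:.2f}  Sp={sp:.2f}  kappa={kappa:.2f}")  # Sn=0.69  Sp=0.66  kappa=0.21
```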
Abstract:
Faced with the continued emergence of antibiotic resistance to all known classes of antibiotics, a paradigm shift in approaches toward antifungal therapeutics is required. Well characterized in a broad spectrum of bacterial and fungal pathogens, biofilms are a key factor in limiting the effectiveness of conventional antibiotics. Therefore, therapeutics such as small molecules that prevent or disrupt biofilm formation would render pathogens susceptible to clearance by existing drugs. This is the first report describing the effect of the Pseudomonas aeruginosa alkylhydroxyquinolone interkingdom signal molecules 2-heptyl-3-hydroxy-4-quinolone and 2-heptyl-4-quinolone on biofilm formation in the important fungal pathogen Aspergillus fumigatus. Decoration of the anthranilate ring on the quinolone framework resulted in significant changes in the capacity of these chemical messages to suppress biofilm formation. Addition of methoxy or methyl groups at the C5–C7 positions led to retention of anti-biofilm activity, in some cases dependent on the alkyl chain length at position C2. In contrast, halogenation at either the C3 or C6 positions led to loss of activity, with one notable exception. Microscopic staining provided key insights into the structural impact of the parent and modified molecules, identifying lead compounds for further development.
Abstract:
Introduction: Urinary tract infection (UTI) is very frequent at the FCI-IC; around 60% of patients diagnosed with nosocomial UTI involve resistant organisms. Since 2010 the CLSI lowered the susceptibility breakpoints for Enterobacteriaceae and removed the need for screening and confirmation of extended-spectrum beta-lactamases (ESBL). This study aims to determine the epidemiological profile of antibiotic prescribing in patients with nosocomial UTI. Design: an observational, analytical, cross-sectional study was conducted. Methods: univariate, bivariate and multivariate analyses were performed. The bivariate and multivariate analyses were carried out to determine the measure of association, taking carbapenem prescription as the dependent variable, evaluated with the chi-square test. Results: 131 urine cultures were reviewed and 116 were included. The most frequent microbiological isolates were E. coli and K. pneumoniae; 43.4% of the isolates expressed ESBL, and 90% of the isolates were susceptible to cefepime. Most of the models obtained showed a strong association between an ESBL report on the antibiogram and carbapenem prescription as final therapy, OR 33.12, 95% CI (2.90–337.4). Conclusion: the epidemiology of nosocomial UTI at the FCI-IC does not differ from international references, there is no adherence to in-hospital management guidelines, and the word "ESBL" appearing on the antibiogram predicts carbapenem prescription by the physician who reads the urine culture.
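As a rough illustration of how an association like the one reported above is quantified, the sketch below computes an odds ratio and its Woolf (log-scale) 95% confidence interval from a 2x2 table; the cell counts are hypothetical and are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Woolf (log-scale) 95% CI.
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, math.exp(math.log(or_) - z * se), math.exp(math.log(or_) + z * se)

# Hypothetical counts: exposure = "ESBL" printed on the antibiogram,
# outcome = carbapenem chosen as final therapy (illustrative only).
print(odds_ratio_ci(a=40, b=10, c=8, d=58))
```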
Abstract:
Nosocomial infections are a growing concern because they affect a large number of people and increase the length of stay in healthcare facilities. In addition, their diagnosis is difficult and requires multiple medical exams. This work is therefore focused on the development of a clinical decision support system to prevent these events from happening. The proposed solution is unique in that it caters for the explicit treatment of incomplete, unknown, or even contradictory information under a logic programming basis, which, to our knowledge, is done here for the first time.
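The abstract gives no implementation detail, but the idea of explicitly representing incomplete, unknown or contradictory findings can be pictured with a small sketch; the value names and merge rule below are illustrative assumptions, not the authors' system.

```python
from enum import Enum

class Truth(Enum):
    TRUE = "true"
    FALSE = "false"
    UNKNOWN = "unknown"              # finding not recorded for this patient
    CONTRADICTORY = "contradictory"  # independent sources disagree

def merge(a: Truth, b: Truth) -> Truth:
    """Combine two reports of the same clinical finding from different sources."""
    if a == b:
        return a
    if Truth.UNKNOWN in (a, b):
        return b if a is Truth.UNKNOWN else a   # a known value outweighs unknown
    return Truth.CONTRADICTORY                  # two definite but conflicting reports

print(merge(Truth.TRUE, Truth.UNKNOWN))   # Truth.TRUE
print(merge(Truth.TRUE, Truth.FALSE))     # Truth.CONTRADICTORY
```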
Abstract:
Healthcare-associated methicillin-resistant Staphylococcus aureus (MRSA) infection may cause increased hospital stay or, sometimes, death. Quantifying this effect is complicated because it is a time-dependent exposure: infection may prolong hospital stay, while longer stays increase the risk of infection. We overcome these problems by using a multinomial longitudinal model for estimating the daily probability of death and discharge. We then extend the basic model to estimate how the effect of MRSA infection varies over time, and to quantify the number of excess ICU days due to infection. We find that infection decreases the relative risk of discharge (relative risk ratio = 0.68, 95% credible interval: 0.54, 0.82), but is only indirectly associated with increased mortality. An infection on the first day of admission resulted in a mean extra stay of 0.3 days (95% CI: 0.1, 0.5) for a patient with an APACHE II score of 10, and 1.2 days (95% CI: 0.5, 2.0) for a patient with an APACHE II score of 30. The decrease in the relative risk of discharge remained fairly constant with day of MRSA infection, but was slightly stronger closer to the start of infection. These results confirm the importance of MRSA infection in increasing ICU stay, but suggest that previous work may have systematically overestimated the effect size.
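To make the mechanics concrete, here is a minimal sketch of the kind of daily discharge/death/stay model described above. It is not the authors' code and is much simpler than their multinomial longitudinal model: the daily exit probabilities are invented, and only the 0.68 relative risk of discharge is taken from the abstract.

```python
def expected_stay(p_discharge, p_death, max_days=365):
    """Expected ICU days when each day ends in discharge, death, or remaining in the unit."""
    p_stay = 1.0 - p_discharge - p_death
    expected, still_in = 0.0, 1.0
    for _ in range(max_days):
        expected += still_in     # one day is accrued by everyone still in the ICU
        still_in *= p_stay       # probability of remaining for another day
    return expected

# Hypothetical daily risks; only the 0.68 relative risk of discharge comes from the abstract.
baseline = expected_stay(p_discharge=0.15, p_death=0.03)
infected = expected_stay(p_discharge=0.15 * 0.68, p_death=0.03)
print(f"extra ICU days ≈ {infected - baseline:.1f}")
```

Lowering the daily discharge probability while holding the death probability fixed lengthens the expected stay, which is the qualitative effect the study quantifies with patient-level covariates such as APACHE II.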
Abstract:
The evolution of organisms that cause healthcare acquired infections (HAI) puts extra stress on hospitals already struggling with rising costs and demands for greater productivity and cost containment. Infection control can save scarce resources, lives, and possibly a facility's reputation, but statistics and epidemiology are not always sufficient to make the case for the added expense. Economics and Preventing Healthcare Acquired Infection presents a rigorous analytic framework for dealing with this increasingly serious problem.

Engagingly written for the economics non-specialist, and brimming with tables, charts, and case examples, the book lays out the concepts of economic analysis in clear, real-world terms so that infection control professionals or infection preventionists will gain competence in developing analyses of their own, and be confident in the arguments they present to decision-makers. The authors:
- Ground the reader in the basic principles and language of economics.
- Explain the role of health economists in general and in terms of infection prevention and control.
- Introduce the concept of economic appraisal, showing how to frame the problem, evaluate and use data, and account for uncertainty.
- Review methods of estimating and interpreting the costs and health benefits of HAI control programs and prevention methods.
- Walk the reader through a published economic appraisal of an infection reduction program.
- Identify current and emerging applications of economics in infection control.

Economics and Preventing Healthcare Acquired Infection is a unique resource for practitioners and researchers in infection prevention, control and healthcare economics. It offers a valuable alternate perspective for professionals in health services research, healthcare epidemiology, healthcare management, and hospital administration.

Written for: Professionals and researchers in infection control, health services research, hospital epidemiology, healthcare economics, healthcare management, hospital administration; Association of Professionals in Infection Control (APIC), Society for Healthcare Epidemiologists of America (SHEA)
Abstract:
Introduction: Some types of antimicrobial-coated central venous catheters (A-CVC) have been shown to be cost-effective in preventing catheter-related bloodstream infection (CR-BSI). However, not all types have been evaluated, and there are concerns over the quality and usefulness of these earlier studies. There is uncertainty amongst clinicians over which, if any, antimicrobial-coated central venous catheters to use. We re-evaluated the cost-effectiveness of all commercially available antimicrobial-coated central venous catheters for prevention of catheter-related bloodstream infection in adult intensive care unit (ICU) patients. Methods: We used a Markov decision model to compare the cost-effectiveness of antimicrobial-coated central venous catheters relative to uncoated catheters. Four catheter types were evaluated: minocycline and rifampicin (MR)-coated catheters; silver, platinum and carbon (SPC)-impregnated catheters; and two chlorhexidine and silver sulfadiazine-coated catheters, one coated on the external surface (CH/SSD (ext)) and the other coated on both surfaces (CH/SSD (int/ext)). The incremental cost per quality-adjusted life-year gained and the expected net monetary benefits were estimated for each. Uncertainty arising from data estimates, data quality and heterogeneity was explored in sensitivity analyses. Results: The baseline analysis, with no consideration of uncertainty, indicated all four types of antimicrobial-coated central venous catheters were cost-saving relative to uncoated catheters. Minocycline and rifampicin-coated catheters prevented 15 infections per 1,000 catheters and generated the greatest health benefits, 1.6 quality-adjusted life-years, and cost-savings, AUD $130,289. After considering uncertainty in the current evidence, the minocycline and rifampicin-coated catheters returned the highest incremental monetary net benefits of $948 per catheter, but there was a 62% probability of error in this conclusion. Although the minocycline and rifampicin-coated catheters had the highest monetary net benefits across multiple scenarios, the decision was always associated with high uncertainty. Conclusions: Current evidence suggests that the cost-effectiveness of using antimicrobial-coated central venous catheters within the ICU is highly uncertain. Policies to prevent catheter-related bloodstream infection amongst ICU patients should consider the cost-effectiveness of competing interventions in the light of this uncertainty. Decision makers would do well to consider the current gaps in knowledge and the complexity of producing good quality evidence in this area.
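A brief sketch of the net-monetary-benefit arithmetic may help here. It uses the per-1,000-catheter figures quoted above and a willingness-to-pay threshold of AUD $40,000 per QALY (the value used later in this listing); it is not the published Markov model, whose expected $948-per-catheter result also reflects probabilistic uncertainty.

```python
def incremental_nmb(delta_qalys, delta_cost, wtp=40_000):
    """Incremental net monetary benefit; positive values favour the coated catheter."""
    return wtp * delta_qalys - delta_cost

# Per 1,000 catheters, from the abstract: 1.6 QALYs gained and AUD $130,289 saved
# (a saving enters as a negative incremental cost). Deterministic arithmetic only.
print(incremental_nmb(delta_qalys=1.6, delta_cost=-130_289))  # 194289.0
```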
Abstract:
Background: Reducing rates of healthcare acquired infection has been identified by the Australian Commission on Safety and Quality in Health Care as a national priority. One of the goals is the prevention of central venous catheter-related bloodstream infection (CR-BSI). At least 3,500 cases of CR-BSI occur annually in Australian hospitals, resulting in unnecessary deaths and costs to the healthcare system between $25.7 and $95.3 million. Two approaches to preventing these infections have been proposed: use of antimicrobial catheters (A-CVCs), or a catheter care and management ‘bundle’. Given finite healthcare budgets, decisions about the optimal infection control policy require consideration of the effectiveness and value for money of each approach. Objectives: The aim of this research is to use a rational economic framework to inform efficient infection control policy relating to the prevention of CR-BSI in the intensive care unit. It addresses three questions relating to decision-making in this area: 1. Is additional investment in activities aimed at preventing CR-BSI an efficient use of healthcare resources? 2. What is the optimal infection control strategy from amongst the two major approaches that have been proposed to prevent CR-BSI? 3. What uncertainty is there in this decision and can a research agenda to improve decision-making in this area be identified? Methods: A decision analytic model-based economic evaluation was undertaken to identify an efficient approach to preventing CR-BSI in Queensland Health intensive care units. A Markov model was developed in conjunction with a panel of clinical experts which described the epidemiology and prognosis of CR-BSI. The model was parameterised using data systematically identified from the published literature and extracted from routine databases. The quality of data used in the model, its validity to clinical experts, and its sensitivity to modelling assumptions were assessed. Two separate economic evaluations were conducted. The first evaluation compared all commercially available A-CVCs alongside uncoated catheters to identify which was cost-effective for routine use. The uncertainty in this decision was estimated along with the value of collecting further information to inform the decision. The second evaluation compared the use of A-CVCs to a catheter care bundle. We were unable to estimate the cost of the bundle because it is unclear what the full resource requirements are for its implementation, and what the value of these would be in an Australian context. As such we undertook a threshold analysis to identify the cost and effectiveness thresholds at which a hypothetical bundle would dominate the use of A-CVCs under various clinical scenarios. Results: In the first evaluation of A-CVCs, the findings from the baseline analysis, in which uncertainty is not considered, show that the use of any of the four A-CVCs will result in health gains accompanied by cost-savings. The MR catheters dominate the baseline analysis, generating 1.64 QALYs and cost-savings of $130,289 per 1,000 catheters. With uncertainty, and based on current information, the MR catheters remain the optimal decision and return the highest average net monetary benefits ($948 per catheter) relative to all other catheter types. This conclusion was robust to all scenarios tested; however, the probability of error in this conclusion is high, 62% in the baseline scenario.
Using a value of $40,000 per QALY, the expected value of perfect information associated with this decision is $7.3 million. An analysis of the expected value of perfect information for individual parameters suggests that it may be worthwhile for future research to focus on providing better estimates of the mortality attributable to CR-BSI and the effectiveness of both SPC and CH/SSD (int/ext) catheters. In the second evaluation of the catheter care bundle relative to A-CVCs, the results which do not consider uncertainty indicate that a bundle must achieve a relative risk of CR-BSI of at least 0.45 to be cost-effective relative to MR catheters. If the bundle can reduce rates of infection from 2.5% to effectively zero, it is cost-effective relative to MR catheters if national implementation costs are less than $2.6 million ($56,610 per ICU). If the bundle can achieve a relative risk of 0.34 (comparable to that reported in the literature) it is cost-effective, relative to MR catheters, if costs over an 18 month period are below $613,795 nationally ($13,343 per ICU). Once uncertainty in the decision is considered, the cost threshold for the bundle increases to $2.2 million. Therefore, if each of the 46 Level III ICUs could implement an 18 month catheter care bundle for less than $47,826 each, this approach would be cost effective relative to A-CVCs. However, the uncertainty is substantial and the probability of error in concluding that the bundle is the cost-effective approach at a cost of $2.2 million is 89%. Conclusions: This work highlights that infection control to prevent CR-BSI is an efficient use of healthcare resources in the Australian context. If there is no further investment in infection control, an opportunity cost is incurred, which is the potential for a more efficient healthcare system. Minocycline/rifampicin catheters are the optimal choice of antimicrobial catheter for routine use in Australian Level III ICUs, however, if a catheter care bundle implemented in Australia was as effective as those used in the large studies in the United States it would be preferred over the catheters if it was able to be implemented for less than $47,826 per Level III ICU. Uncertainty is very high in this decision and arises from multiple sources. There are likely greater costs to this uncertainty for A-CVCs, which may carry hidden costs, than there are for a catheter care bundle, which is more likely to provide indirect benefits to clinical practice and patient safety. Research into the mortality attributable to CR-BSI, the effectiveness of SPC and CH/SSD (int/ext) catheters and the cost and effectiveness of a catheter care bundle in Australia should be prioritised to reduce uncertainty in this decision. This thesis provides the economic evidence to inform one area of infection control, but there are many other infection control decisions for which information about the cost-effectiveness of competing interventions does not exist. This work highlights some of the challenges and benefits to generating and using economic evidence for infection control decision-making and provides support for commissioning more research into the cost-effectiveness of infection control.
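The per-ICU figures above follow from spreading each national cost threshold across the 46 Level III ICUs; the sketch below simply restates that arithmetic (small rounding differences from the quoted figures are expected, and the thresholds themselves come from the full model, not from this division).

```python
N_LEVEL_III_ICUS = 46  # number of Level III ICUs stated in the abstract

def per_icu(national_threshold_cost):
    """Per-ICU implementation budget implied by a national cost threshold."""
    return national_threshold_cost / N_LEVEL_III_ICUS

for national in (2_600_000, 613_795, 2_200_000):
    print(f"${national:,} nationally -> ${per_icu(national):,.0f} per ICU")
```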
Abstract:
A SNP genotyping method was developed for E. faecalis and E. faecium using the 'Minimum SNPs' program. SNP sets were interrogated using allele-specific real-time PCR. SNP-typing sub-divided clonal complexes 2 and 9 of E. faecalis and 17 of E. faecium, members of which cause the majority of nosocomial infections globally.
Abstract:
Catheter associated urinary tract infections (CAUTI) are a worldwide problem that may lead to increased patient morbidity, cost and mortality.1-3 The literature is divided on whether there are real effects from CAUTI on length of stay or mortality. Platt4 found the costs and mortality risks to be large, yet Graves et al found the opposite.5 A review of the published estimates of the extra length of stay showed results between zero and 30 days.6 The differences in estimates may have been caused by the different epidemiological methods applied. Accurately estimating the effects of CAUTI is difficult because it is a time-dependent exposure. This means that standard statistical techniques, such as matched case-control studies, tend to overestimate the increased hospital stay and mortality risk due to infection. The aim of the study was to estimate excess length of stay and mortality in an intensive care unit (ICU) due to a CAUTI, using a statistical model that accounts for the timing of infection. Data collected from ICUs in lower and middle income countries were used for this analysis.7,8 There has been little research for these settings, hence the need for this paper.
Abstract:
Staphylococci are important pathogenic bacteria responsible for a range of diseases in humans. The most frequently isolated microorganisms in a hospital microbiology laboratory are staphylococci. The general classification of staphylococci divides them into two major groups: coagulase-positive staphylococci (e.g. Staphylococcus aureus) and coagulase-negative staphylococci (e.g. Staphylococcus epidermidis). Coagulase-negative staphylococcal (CoNS) isolates include a variety of species and many different strains but are often dominated by the most important organism of this group, S. epidermidis. Currently, these organisms are regarded as important pathogens causing infections related to prosthetic materials and surgical wounds. A significant number of S. epidermidis isolates are also resistant to different antimicrobial agents. Virulence factors in CoNS are not clearly established and not well documented. S. epidermidis is evolving as a resistant and powerful microbe associated with nosocomial infections because it has different properties which independently, and in combination, make it a successful infectious agent, especially in the hospital environment. Such characteristics include biofilm formation, drug resistance and the evolution of genetic variables. The purpose of this project was to develop a novel SNP genotyping method to genotype S. epidermidis strains originating from hospital patients and healthy individuals. High-Resolution Melt Analysis was used to assign binary typing profiles to both clinical and commensal strains using a new bioinformatics approach. The presence of antibiotic resistance genes and biofilm-coding genes was also interrogated in these isolates.
Abstract:
Enterococci are versatile Gram-positive bacteria that can survive under extreme conditions. Most enterococci are non-virulent and found in the gastrointestinal tract of humans and animals. Other strains are opportunistic pathogens that contribute to a large number of nosocomial infections globally. Epidemiological studies demonstrated a direct relationship between the density of enterococci in surface waters and the risk of swimmer-associated gastroenteritis. The distribution of infectious enterococcal strains from the hospital environment or other sources to environmental water bodies through sewage discharge or other means could increase the prevalence of these strains in the human population. Environmental water quality studies may benefit from focusing on a subset of Enterococcus spp. that are consistently associated with sources of faecal pollution such as domestic sewage, rather than testing for the entire genus. E. faecalis and E. faecium are potentially good focal species for such studies, as they have been consistently identified as the dominant Enterococcus spp. in human faeces and sewage. At the same time, enterococcal infections are predominantly caused by E. faecalis and E. faecium. The characterisation of E. faecalis and E. faecium is important in studying their population structures, particularly in environmental samples. By developing and implementing rapid, robust molecular genotyping techniques, it is possible to more accurately establish the relationship between human and environmental enterococci. Of particular importance is determining the distribution of high-risk enterococcal clonal complexes, such as E. faecium clonal complex 17 and E. faecalis clonal complexes 2 and 9, in recreational waters. These clonal complexes are recognized as particularly pathogenic enterococcal genotypes that cause severe disease in humans globally. The Pimpama-Coomera watershed is located in South East Queensland, Australia and was investigated in this study mainly because it is used intensively for agriculture and recreational purposes and has a strong anthropogenic impact. The primary aim of this study was to develop novel, universally applicable, robust, rapid and cost-effective genotyping methods which are likely to yield more definitive results for the routine monitoring of E. faecalis and E. faecium, particularly in environmental water sources. To fulfil this aim, new genotyping methods were developed based on the interrogation of highly informative single nucleotide polymorphisms (SNPs) located in housekeeping genes of both E. faecalis and E. faecium. SNP genotyping was successfully applied in field investigations of the Coomera watershed, South-East Queensland, Australia. E. faecalis and E. faecium isolates were grouped into 29 and 23 SNP profiles respectively. This study showed the high longitudinal diversity of E. faecalis and E. faecium over a period of two years, and both human-related and human-specific SNP profiles were identified. Furthermore, 4.25% of E. faecium strains isolated from water were found to correspond to the important clonal complex-17 (CC17). Strains that belong to CC17 cause the majority of hospital outbreaks and clinical infections globally. Of the six sampling sites of the Coomera River, Paradise Point had the highest number of human-related and human-specific E. faecalis and E. faecium SNP profiles. The secondary aim of this study was to determine the antibiotic-resistance profiles and virulence traits associated with environmental E. faecalis and E. faecium isolates compared to human pathogenic E. faecalis and E. faecium isolates. This was performed to predict the potential health risks associated with coming into contact with these strains in the Coomera watershed. In general, clinical isolates were found to be more resistant to all the antibiotics tested compared to water isolates, and they harbored more virulence traits. Multi-drug resistance was more prevalent in clinical isolates (71.18% of E. faecalis and 70.3% of E. faecium) compared to water isolates (only 5.66% of E. faecium). However, tetracycline, gentamicin, ciprofloxacin and ampicillin resistance was observed in water isolates. The virulence gene esp was the most prevalent virulence determinant observed in clinical isolates (67.79% of E. faecalis and 70.37% of E. faecium), and this gene has been described as a human-specific marker used for microbial source tracking (MST). The presence of esp in water isolates (16.36% of E. faecalis and 19.14% of E. faecium) could be indicative of human faecal contamination in these waterways. Finally, in order to compare overall gene expression between environmental and clinical strains of E. faecalis, a comparative gene hybridization study was performed. The results of this investigation clearly demonstrated the up-regulation of genes associated with pathogenicity in E. faecalis isolated from water. The expression study was performed at physiological temperatures relative to ambient temperatures. The up-regulation of virulence genes demonstrates that environmental strains of E. faecalis can pose an increased health risk which can lead to serious disease, particularly if these strains belong to the virulent CC17 group. The genotyping techniques developed in this study not only provide a rapid, robust and highly discriminatory tool to characterize E. faecalis and E. faecium, but also enable the efficient identification of virulent enterococci that are distributed in environmental water sources.
Abstract:
We consider how data from scientific research should be used for decision making in health services. Whether a hand hygiene intervention to reduce risk of nosocomial infection should be widely adopted is the case study. Improving hand hygiene has been described as the most important measure to prevent nosocomial infection.1 Transmission of microorganisms is reduced, and fewer infections arise, which leads to a reduction in mortality2 and cost savings.3 Implementing a hand hygiene program is itself costly, so the extra investment should be tested for cost-effectiveness.4,5 The first part of our commentary is about cost-effectiveness models and how they inform decision making for health services. The second part is about how data on the effectiveness of hand hygiene programs arising from scientific studies are used, and 2 points are made: the threshold for statistical inference of .05 used to judge effectiveness studies is not important for decision making,6,7 and potentially valuable evidence about effectiveness might be excluded by decision makers because it is deemed low quality.8 The ideas put forward will help researchers and health services decision makers to appraise scientific evidence in a more powerful way.
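As a toy illustration of the first point, that a decision can rest on expected net benefit rather than on whether an effectiveness estimate clears the .05 threshold, the sketch below uses invented numbers for a hypothetical hand hygiene programme, not data from any of the cited studies.

```python
import random

random.seed(1)

def expected_net_benefit(effect_draws, value_per_infection_averted, programme_cost):
    """Mean net monetary benefit over draws from an uncertain effectiveness estimate."""
    benefits = [value_per_infection_averted * e - programme_cost for e in effect_draws]
    return sum(benefits) / len(benefits)

# Invented inputs: the effect (infections averted per year) is uncertain enough that a
# conventional significance test could fail, yet the expected net benefit is positive,
# so a decision maker acting on expectation would still adopt the programme.
draws = [random.gauss(20, 15) for _ in range(10_000)]
print(round(expected_net_benefit(draws, value_per_infection_averted=5_000, programme_cost=60_000)))
```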