939 results for Blood-stream Infection
Abstract:
Toxocara vitulorum is a pathogenic nematode of the small intestine of very young buffalo calves. To understand the development of the inflammatory responses in the gut wall, tissue samples were removed from the duodenum, jejunum and ileum of buffalo calves naturally infected with T. vitulorum at the beginning of the infection, at the peak of egg output, and during the periods of worm rejection and post-rejection. Two additional control groups of uninfected calves (kept free of infection by anthelmintic therapy of their mothers and after birth) were also necropsied on days 30 and 50 after birth. Blood samples were collected fortnightly from birth to 174 days post-birth, and blood smears were prepared and stained with Giemsa for eosinophil counts. The parasitological status of the buffalo calves was evaluated through weekly fecal egg counts (EPG) from 1 to 106 days after birth, which revealed that T. vitulorum egg shedding started on day 11 and reached its peak on day 49, and that the parasites were expelled between days 50 and 85 after birth. In the infected buffalo calves, the mast cell population increased two-fold in the mucosa (villus-crypt unit, VCU) of the duodenum and four-fold in the proximal jejunum, but these increases were statistically significant only at the peak of the infection. Although mast cell numbers also increased in the mucosa of the ileum and in the submucosal and muscle tissues of the duodenum, proximal jejunum and ileum, these values were not significantly different from the controls. Eosinophil numbers increased in the mucosa of the duodenum (two- to five-fold higher than in the controls) and proximal jejunum (three- to five-fold) during the infection (beginning, peak and rejection). The relative numbers of eosinophils in the blood stream increased from the second to the seventh week. In conclusion, T. vitulorum infection elicited mastocytosis and tissue eosinophilia in the duodenum and proximal jejunum, as well as eosinophilia in the blood stream, at the beginning, at the peak and during the rejection of the worms. After worm rejection, the numbers of these cells returned to normal levels, suggesting that these cells may have a role in the process of rejection of T. vitulorum by the host. (C) 2003 Elsevier B.V. All rights reserved.
Abstract:
Using A/J mice, which are susceptible to Staphylococcus aureus, we sought to identify genetic determinants of susceptibility to S. aureus and to evaluate their function with regard to S. aureus infection. One QTL region on chromosome 11 containing 422 genes was found to be significantly associated with susceptibility to S. aureus infection. Of these 422 genes, whole-genome transcription profiling identified five genes (Dcaf7, Dusp3, Fam134c, Psme3, and Slc4a1) that were significantly differentially expressed in a) S. aureus-infected susceptible (A/J) vs. resistant (C57BL/6J) mice and b) humans with S. aureus blood stream infection vs. healthy subjects. Three of these genes (Dcaf7, Dusp3, and Psme3) were shown by qPCR to be down-regulated in susceptible vs. resistant mice at both pre- and post-infection time points. siRNA-mediated knockdown of Dusp3 and Psme3 induced significant increases in cytokine production in S. aureus-challenged RAW264.7 macrophages and bone marrow-derived macrophages (BMDMs) by enhancing NF-κB signaling activity. Similar increases in cytokine production and NF-κB activity were also seen in BMDMs from CSS11 mice (C57BL/6J background with chromosome 11 from A/J), but not in BMDMs from C57BL/6J mice. These findings suggest that Dusp3 and Psme3 contribute to S. aureus infection susceptibility in A/J mice and play a role in human S. aureus infection.
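The abstract reports that Dcaf7, Dusp3 and Psme3 were down-regulated by qPCR but does not state the quantification method. As a purely illustrative sketch, the snippet below applies the widely used 2^-ΔΔCt relative-quantification approach; the Ct values and the Gapdh reference gene are hypothetical placeholders, not data from the study.

```python
# Minimal sketch of relative qPCR quantification with the 2^-ΔΔCt method.
# The Ct values, the Gapdh reference gene and the group labels are hypothetical
# placeholders, not data from the study.

def relative_expression(ct_target_sample, ct_ref_sample,
                        ct_target_control, ct_ref_control):
    """Fold change of the target gene in 'sample' relative to 'control'."""
    delta_ct_sample = ct_target_sample - ct_ref_sample      # normalize to reference gene
    delta_ct_control = ct_target_control - ct_ref_control
    delta_delta_ct = delta_ct_sample - delta_ct_control
    return 2 ** (-delta_delta_ct)

# Example: Dusp3 in susceptible (A/J) vs. resistant (C57BL/6J) macrophages,
# normalized to a hypothetical Gapdh reference.
fold = relative_expression(ct_target_sample=26.8, ct_ref_sample=18.2,   # A/J
                           ct_target_control=24.9, ct_ref_control=18.0) # C57BL/6J
print(f"Dusp3 fold change (A/J vs. C57BL/6J): {fold:.2f}")  # < 1 indicates down-regulation
```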
Abstract:
Evaluation of the technical and diagnostic feasibility of a commercial multiplex real-time polymerase chain reaction (PCR) assay for the detection of blood stream infections in a cohort of intensive care unit (ICU) patients with severe sepsis, performed in addition to conventional blood cultures.
Abstract:
Background Guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) vary greatly around the world. Most institutions recommend the use of heparin to prevent occlusion; however, there is debate regarding the need for heparin, and evidence to suggest that 0.9% sodium chloride (normal saline) may be as effective. The use of heparin is not without risk, may be unnecessary and is also associated with increased cost. Objectives To assess the clinical effects (benefits and harms) of intermittent flushing with heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants and children. Search Methods The Cochrane Vascular Trials Search Co-ordinator searched the Specialised Register (last searched April 2015) and the Cochrane Register of Studies (Issue 3, 2015). We also searched the reference lists of retrieved trials. Selection Criteria Randomised controlled trials that compared the efficacy of normal saline with heparin to prevent occlusion of long-term CVCs in infants and children up to 18 years of age were included. We excluded temporary CVCs and peripherally inserted central catheters (PICC). Data Collection and Analysis Two review authors independently assessed trial inclusion criteria and trial quality, and extracted data. Rate ratios were calculated for two outcome measures: occlusion of the CVC and central line-associated blood stream infection. Other outcome measures included duration of catheter placement, inability to withdraw blood from the catheter, use of urokinase or recombinant tissue plasminogen activator, incidence of removal or re-insertion of the catheter, or both, and other CVC-related complications such as dislocation of CVCs, other CVC site infections and thrombosis. Main Results Three trials with a total of 245 participants were included in this review. The three trials directly compared the use of normal saline and heparin; however, the studies used different protocols for the standard and experimental arms, with different heparin concentrations and different flushing frequencies. In addition, not all studies reported on all outcomes. The quality of the evidence ranged from low to very low because there was no blinding, heterogeneity and inconsistency between studies were high, and the confidence intervals were wide. CVC occlusion was assessed in all three trials (243 participants). We were able to pool the results of two trials for the outcomes of CVC occlusion and CVC-associated blood stream infection. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin groups was 0.75 (95% CI 0.10 to 5.51, two studies, 229 participants, very low quality evidence). The estimated rate ratio for CVC-associated blood stream infection was 1.48 (95% CI 0.24 to 9.37, two studies, 231 participants; low quality evidence). The duration of catheter placement was reported to be similar between the two study arms in one study (203 participants). Authors' Conclusions The review found that there was not enough evidence to determine the effects of intermittent flushing with heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants and children. Ultimately, if this evidence were available, the development of evidence-based clinical practice guidelines and consistency of practice would be facilitated.
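The pooled effect estimates above are rate ratios per 1000 catheter days. The abstract does not spell out the calculation, but a minimal sketch of how a rate ratio and a Wald-type 95% confidence interval can be obtained from event counts and catheter-day totals is shown below; the counts and denominators in the example are hypothetical, not data from the included trials.

```python
import math

def rate_ratio_ci(events_a, days_a, events_b, days_b, z=1.96):
    """Rate ratio (group A vs. group B) with a Wald 95% CI computed on the log scale."""
    rate_a = events_a / days_a
    rate_b = events_b / days_b
    rr = rate_a / rate_b
    se_log_rr = math.sqrt(1 / events_a + 1 / events_b)  # SE of log rate ratio
    lo = math.exp(math.log(rr) - z * se_log_rr)
    hi = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lo, hi

# Hypothetical example: occlusions in the normal saline vs. heparin arms.
rr, lo, hi = rate_ratio_ci(events_a=6, days_a=4000, events_b=8, days_b=4200)
print(f"Rate ratio {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
# Rates per 1000 catheter days: 6/4000*1000 = 1.5 vs. 8/4200*1000 ≈ 1.9
```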
Abstract:
Background Around the world, guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) vary greatly. To prevent occlusion, most institutions recommend the use of heparin when the CVC is not in use. However, there is debate regarding the need for heparin, and evidence to suggest that normal saline may be as effective. The use of heparin is not without risk, may be unnecessary and is also associated with increased costs. Objectives To assess the clinical effects (benefits and harms) of heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants, children and adolescents. Design A Cochrane systematic review of randomised controlled trials was undertaken. Data Sources The Cochrane Vascular Group Specialised Register (including MEDLINE, CINAHL, EMBASE and AMED) and the Cochrane Register of Studies were searched. Hand searching of relevant journals and reference lists of retrieved articles was also undertaken. Review Methods Data were extracted and appraisal was undertaken. We included studies that compared the efficacy of normal saline with heparin to prevent occlusion. We excluded temporary CVCs and peripherally inserted central catheters. Rate ratios per 1000 catheter days were calculated for two outcomes: occlusion of the CVC and CVC-associated blood stream infection. Results Three trials with a total of 245 participants were included in this review. The three trials directly compared the use of normal saline and heparin. However, the studies used different protocols, with various heparin concentrations and flushing frequencies. The quality of the evidence ranged from low to very low. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin groups was 0.75 (95% CI 0.10 to 5.51, two studies, 229 participants, very low quality evidence). The estimated rate ratio for CVC-associated blood stream infection was 1.48 (95% CI 0.24 to 9.37, two studies, 231 participants; low quality evidence). Conclusions It remains unclear whether heparin is necessary for CVC maintenance. More well-designed studies are required to answer this relatively simple but clinically important question. Ultimately, if this evidence were available, the development of evidence-based clinical practice guidelines and consistency of practice would be facilitated.
Abstract:
Introduction: Healthcare-associated infections (HAI) currently represent one of the main challenges to the quality of patient care, especially in patients undergoing hematopoietic stem cell transplantation (HSCT). Daily bathing with 2% chlorhexidine gluconate (CHG) has been proposed, mainly in intensive care units (ICUs), to reduce bacterial colonization of the patient and thereby reduce HAI. The objective of this study was to evaluate the impact of bathing with 2% CHG in an HSCT inpatient unit on the incidence of infection and colonization by multidrug-resistant pathogens, and also to evaluate its impact on bacterial susceptibility to the antiseptic. Methods: A quasi-experimental study was conducted over 9 years, from January 2005 to December 2013. The intervention started in August 2009, so that the pre- and post-intervention periods each lasted 4.5 years. The rates of HAI, infection by multidrug-resistant Gram-negative bacteria, and infection and colonization by vancomycin-resistant enterococci (VRE) were evaluated by time-series analysis to study the impact of the intervention. The minimum inhibitory concentrations (MIC) of the bacteria for CHG, with and without the efflux pump inhibitor CCCP, were evaluated in both periods. CHG resistance genes were studied by PCR, and the clonality of the isolates by pulsed-field gel electrophoresis. Results: A significant reduction in the incidence of VRE infection and colonization was observed in the unit in the post-intervention period (p = 0.001). This rate remained stable in the hospital's other clinical ICUs. However, the rates of infection by multidrug-resistant Gram-negative bacteria increased in the unit in recent years, and there was no decrease in the overall HAI rate in the unit. The CHG MICs increased in the VRE and K. pneumoniae isolates after the period of exposure to the antiseptic, with a marked drop in MIC after the addition of CCCP, indicating that efflux pumps are an important mechanism of resistance to CHG. The A. baumannii and P. aeruginosa isolates showed no increase in MIC after the period of chlorhexidine exposure. The AdeA, AdeB and AdeC efflux pumps were present in most A. baumannii isolates from the control group (66%). The cepA pump was found in 67% of all K. pneumoniae tested and in 44.5% of the P. aeruginosa isolates from the pre-intervention group. We observed a positive relationship between the presence of cepA in the K. pneumoniae isolates and the response to CCCP: of the 49 cepA-positive isolates, 67.3% had their MIC reduced by 4 dilutions after the addition of CCCP. Clonality analysis demonstrated a polyclonal pattern among the VRE, K. pneumoniae and A. baumannii isolates evaluated. Among the P. aeruginosa isolates, a predominant clone with >80% similarity was observed in the post-intervention period in 10 of the 22 isolates evaluated by dendrogram. Conclusions: Chlorhexidine bathing reduced the incidence of VRE infection and colonization in the HSCT unit, but did not have the same impact on Gram-negative bacteria. The molecular mechanisms of chlorhexidine resistance are closely linked to the presence of efflux pumps, which are probably the main mechanism of bacterial resistance and tolerance to the antiseptic.
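The abstract states that infection and colonization rates were evaluated by time-series analysis to study the impact of the intervention, but does not detail the model. As a purely illustrative sketch of one common approach to such quasi-experimental before/after data, the snippet below fits a segmented (interrupted) Poisson regression with a patient-day offset; the monthly counts, denominators, cut-off month and variable names are simulated, hypothetical values, not data from the study.

```python
# Minimal sketch of a segmented (interrupted) time-series Poisson regression,
# one common way to analyze a quasi-experimental before/after intervention.
# All data below are simulated; the study's actual model is not specified.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_months = 108                                            # 9 years of monthly data
df = pd.DataFrame({
    "month": np.arange(n_months),                         # underlying time trend
    "post": (np.arange(n_months) >= 54).astype(int),      # 1 after the intervention starts
    "patient_days": rng.integers(500, 800, n_months),     # exposure denominator
})
df["months_since"] = np.where(df["post"] == 1, df["month"] - 54, 0)  # post-intervention slope
df["infections"] = rng.poisson(3 + 0.02 * df["month"] - 1.0 * df["post"])

model = smf.glm(
    "infections ~ month + post + months_since",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["patient_days"]),    # models the rate per patient-day
).fit()
print(model.summary())                    # 'post' = level change, 'months_since' = slope change
```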
Abstract:
Malaria, a disease caused by Plasmodium, represents a major health problem with a still disconcertingly high mortality rate (655 000 malaria deaths were estimated by the World Health Organization in 2012), mainly in Africa [1]. After a bite by an infected Anopheles mosquito, Plasmodium sporozoites reach their target organ, the liver, within minutes. After traversing several hepatocytes, the parasite invades a final one and establishes a parasitophorous vacuole, where it replicates exponentially, generating thousands of infective merozoites, the red blood cell-infectious forms that are released into the blood stream. The liver stage is the first obligatory phase of malaria infection and, although no symptoms are associated with it, it is absolutely crucial to the establishment of a successful infection. (...)
Abstract:
Objective: The objective of this study was to analyze the incidence of and risk factors for healthcare-associated infections (HAI) among hematopoietic stem cell transplantation (HSCT) patients, and the impact of such infections on mortality during hospitalization. Methods: We conducted a 9-year (2001-2009) retrospective cohort study including patients who underwent HSCT at a reference center in Sao Paulo, Brazil. The incidence of HAI was calculated using days of neutropenia as the denominator. Data were analyzed using EpiInfo 3.5.1. Results: Over the 9-year period there were 429 neutropenic HSCT patients, with a total of 6816 days of neutropenia. Bloodstream infections (BSI) were the most frequent infection, presenting in 80 (18.6%) patients, with an incidence of 11.7 per 1000 days of neutropenia. Most bacteremias were due to Gram-negative bacteria: 43 (53.8%) cases were caused by Gram-negative species, while 33 (41.2%) were caused by Gram-positive species and four (5%) by fungal species. Independent risk factors associated with HAI were prolonged neutropenia (odds ratio (OR) 1.07, 95% confidence interval (CI) 1.04-1.10) and duration of fever (OR 1.20, 95% CI 1.12-1.30). Risk factors associated with death in multivariate analyses were age (OR 1.02, 95% CI 1.01-1.43), having undergone an allogeneic transplant (OR 3.08, 95% CI 1.68-5.56), a microbiologically documented infection (OR 2.96, 95% CI 1.87-4.6), invasive aspergillosis (OR 2.21, 95% CI 1.1-4.3), and acute leukemia (OR 2.24, 95% CI 1.3-3.6). Conclusions: BSI was the most frequent HAI, and there was a predominance of Gram-negative microorganisms. Independent risk factors associated with HAI were duration of neutropenia and fever, and the risk factors for a poor outcome were older age, type of transplant (allogeneic), the presence of a microbiologically documented infection, invasive aspergillosis, and acute leukemia. Further prospective studies with larger numbers of patients may confirm the role of these risk factors for a poor clinical outcome and death in this transplant population. (C) 2012 Published by Elsevier Ltd on behalf of International Society for Infectious Diseases.
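The incidence figure quoted above follows directly from the counts reported in the abstract (80 BSI episodes over 6816 days of neutropenia):

\[
\text{BSI incidence} = \frac{80 \text{ episodes}}{6816 \text{ neutropenia-days}} \times 1000 \approx 11.7 \text{ per 1000 neutropenia-days}.
\]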
Abstract:
Introduction: The aim of this work was to identify possible lymphatic filariasis foci in the western Brazilian Amazon region that might have become established, based on the reports of Rachou in the 1950s. The study was conducted in three cities of the western Brazilian Amazon region - Porto Velho and Guajará-Mirim (State of Rondônia) and Humaitá (State of Amazonas). Methods: For the evaluation of human infection, thick blood smears stained with Giemsa were used to analyze samples collected between 10 pm and 1 am. Polymerase chain reaction (PCR) was used to examine mosquito vectors for the presence of Wuchereria bancrofti DNA. Humans were randomly sampled from night-school students and from inhabitants of neighborhoods lacking sanitation. Mosquitoes were collected from residences only. Results: A total of 2,709 night-school students enrolled in the Program for Education of Young Adults (EJA) and 935 people registered in residences near the schools were examined; of the latter, 641 were from Porto Velho, 214 from Guajará-Mirim and 80 from Humaitá. No individual examined was positive for the presence of microfilariae in the blood stream. All 7,860 female Culex quinquefasciatus specimens examined were negative by PCR. Conclusions: This survey, including human and mosquito examinations, indicates that the western Amazon region of Brazil is not a focus of Bancroftian filariasis infection or transmission. Therefore, this region does not need to be included in the Brazilian lymphatic filariasis control program.
Abstract:
Until recently, the low-abundance (LA) range of the serum proteome was an unexplored reservoir of diagnostic information. Today it is increasingly appreciated that a diagnostic goldmine of LA biomarkers resides in the blood stream in complexed association with more abundant higher molecular weight carrier proteins such as albumin and immunoglobulins. As we now look to the possibility of harvesting these LA biomarkers more efficiently through engineered nano-scale particles, mathematical approaches are needed in order to reveal the mechanisms by which blood carrier proteins act as molecular 'mops' for LA diagnostic cargo, and the functional relationships between bound LA biomarker concentrations and other variables of interest such as biomarker intravasation and clearance rates and protein half-lives in the bloodstream. Here we show, by simple mathematical modeling, how the relative abundance of large carrier proteins and their longer half-lives in the bloodstream work together to amplify the total blood concentration of these tiny biomarkers. The analysis further suggests that alterations in the production of biomarkers lead to gradual rather than immediate changes in biomarker levels in the blood circulation. The model analysis also points to the characteristics of artificial nano-particles that would render them more efficient harvesters of tumor biomarkers in the circulation, opening up possibilities for the early detection of curable disease, rather than simply better detection of advanced disease.
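The abstract argues, from a simple mathematical model, that binding to abundant, long-lived carrier proteins amplifies the total blood concentration of a low-abundance biomarker and makes its response to changes in production gradual rather than immediate. The following is a minimal two-compartment linear ODE sketch of that idea, not the authors' actual model; all rate constants and the step change in production are hypothetical, chosen only to illustrate the qualitative behaviour.

```python
# Minimal sketch: free biomarker is produced at rate p and cleared quickly;
# binding to an abundant carrier moves it into a pool cleared at the carrier's
# slow rate, amplifying the total concentration and slowing its response to a
# step change in production. Illustrative parameters only, not the authors' model.
import numpy as np
from scipy.integrate import solve_ivp

k_free = 10.0     # 1/h, fast clearance of free biomarker (short half-life)
k_on = 5.0        # 1/h, effective binding rate to the (abundant) carrier
k_carrier = 0.05  # 1/h, slow clearance of carrier-bound biomarker (long half-life)

def production(t):
    return 1.0 if t < 48 else 3.0   # step increase in biomarker production at t = 48 h

def rhs(t, y):
    free, bound = y
    d_free = production(t) - (k_free + k_on) * free
    d_bound = k_on * free - k_carrier * bound
    return [d_free, d_bound]

sol = solve_ivp(rhs, (0, 200), [0.0, 0.0], dense_output=True, max_step=0.5)
t = np.linspace(0, 200, 5)
free, bound = sol.sol(t)
for ti, f, b in zip(t, free, bound):
    print(f"t={ti:6.1f} h  free={f:.3f}  bound={b:.3f}  total={f + b:.3f}")
# The bound pool dwarfs the free pool and approaches its new steady state on the
# slow 1/k_carrier timescale, i.e. the change in total blood level is gradual.
```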
Abstract:
Candida species are an important cause of nosocomial bloodstream infections in hospitalized patients worldwide, with associated high mortality, excess length of stay and costs. A main contributor to candidemia is profound immunosuppression due to serious underlying conditions or intensive treatments, which lead to an increasing number of susceptible patients. The rank order of causative Candida species varies over time and across geographic locations. The aim of this study was to obtain information on the epidemiology of candidemia in Finland and to identify trends in incidence, causative species, and patient populations at risk. In order to reveal possible outbreaks and assess the value of one molecular typing method, restriction enzyme analysis (REA), in epidemiological studies, we analyzed C. albicans bloodstream isolates from the Uusimaa region in southern Finland over eight years. Data from the National Infectious Disease Register were used to assess the incidence and epidemiological features of candidemia cases. In Helsinki University Central Hospital (HUCH), all patients with a blood culture yielding any Candida spp. were identified from laboratory logbooks and from the Finnish Hospital Infection Program. All patients with a stored blood culture isolate of C. albicans were identified through microbiology laboratory logbooks, and stored isolates were genotyped with REA at the National Institute for Health and Welfare (formerly KTL). The incidence of candidemia in Finland is relatively low by global standards, but increased between the 1990s and the 2000s. The incidence was highest in males >65 years of age, but incidence rates for patients aged <1-15 years were lower during the 2000s than during the 1990s. In HUCH the incidence of candidemia remained low and constant during our 18 years of observation, but a significant shift in the patient populations at risk was observed, toward patients treated in intensive care units, such as premature neonates and surgical patients. The predominant causative species in Finland and in HUCH is C. albicans, but the proportion of C. glabrata increased considerably. The crude one-month case fatality remained high, between 28% and 33%. REA differentiated efficiently between C. albicans blood culture isolates, and no clusters were observed in the hospitals involved, despite abundant transfer of patients among them. Candida spp. are an important cause of nosocomial blood stream infections in Finland, and continued surveillance is necessary to determine the overall trends and patient groups at risk, and to reduce the impact of these infections in the future. Molecular methods provide an efficient tool for the investigation of suspected outbreaks and should also remain available in Finland in the future.