93 results for Opportunistic Infections
Abstract:
Chlamydia trachomatis is the leading cause of bacterial sexually transmitted disease worldwide, resulting in 4–5 million new cases of Chlamydia annually and an estimated 100 million cases per annum. Infections of the lower female genital tract (FGT) are frequently asymptomatic, so they often remain undiagnosed or untreated. If infections are not resolved or are left untreated, chlamydia can ascend to the upper FGT and infect the fallopian tubes (FTs), causing salpingitis that may lead to functional damage of the FTs and tubal factor infertility (TFI). Clinical observations and experimental data have indicated a role for antibodies against C. trachomatis proteins, such as the 60 kDa chlamydial heat-shock protein (cHSP60), in the immunopathogenesis of TFI. When released from infected cells, cHSP60 can induce pro-inflammatory immune responses that may functionally impair the FTs, leading to fibrosis and luminal occlusion. Chlamydial pathogenesis of irreversible and permanent tubal damage is a consequence of innate and adaptive host immune responses to ongoing or repeated infections. The extracellular matrix (ECM), which is regulated by matrix metalloproteinases (MMPs), may also be modified by chlamydial infections of the FGT. This review will highlight protective and pathogenic immune responses to ongoing and repeated chlamydial infections of the FGT. It will also present two recent hypotheses to explain mechanisms that may contribute to FT damage during a C. trachomatis infection. If Chlamydia immunopathology can be controlled, it might yield a method of inducing fibrosis and thus provide a means of non-surgical permanent contraception for women.
Abstract:
Enterococcus faecalis is a Gram-positive, coccus-shaped, lactic acid bacterium that is ubiquitous across multiple anatomical sites. E. faecalis has been isolated from clinical samples as the etiological agent in patients with overt infections, and from body sites previously thought to be sterile in the absence of signs and symptoms of infection. E. faecalis is implicated in both human health and disease, being recognized as a commensal, a probiotic and an opportunistic, multiply resistant pathogen, and it has emerged as a key pathogen in nosocomial infections. E. faecalis is well equipped to avert recognition by host cell immune mediators. Antigenic cell wall components, including lipoteichoic acids, are concealed from immune detection by capsular polysaccharides produced by some strains, thereby preventing complement activation, the pro-inflammatory response, opsonisation and phagocytosis. E. faecalis also produces a suite of enzymes, including gelatinase and cytolysin, which aid in both virulence and host immune evasion. The ability of enterococci to form biofilms in vivo further increases virulence, whilst simultaneously preventing detection by host cells. E. faecalis exhibits high levels of both intrinsic and acquired antimicrobial resistance. The mobility of the E. faecalis genome is a significant contributor to antimicrobial resistance, with this species also transferring resistance to other Gram-positive bacteria. Whilst E. faecalis is of increasing concern in nosocomial infections, its role as a member of the endogenous microbiota should not be underestimated. As a commensal and probiotic, E. faecalis plays an integral role in modulating the immune response, and in providing endogenous antimicrobial activity to enhance exclusion or inhibition of opportunistic pathogens in certain anatomical niches. In this chapter we will review possible mediators of the enterococcal transition from commensal microbe to opportunistic pathogen, considering isolates obtained from patients diagnosed with pathogenic infections and those obtained from asymptomatic patients.
Abstract:
IgA is an important mucosal antibody that can neutralize mucosal pathogens either by preventing attachment to epithelia (immune exclusion) or by inhibiting intraepithelial replication following transcytosis by the polymeric immunoglobulin receptor (pIgR). Chlamydia trachomatis is a major human pathogen that initially targets the endocervical or urethral epithelium in women and men, respectively. As both tissues contain abundant SIgA, we assessed the protection afforded by IgA targeting different chlamydial antigens expressed during the extra- and intraepithelial stages of infection. We developed an in vitro model utilizing polarizing cells expressing the murine pIgR together with antigen-specific mouse IgA, and an in vivo model utilizing pIgR-/- mice. SIgA targeting the extraepithelial chlamydial antigen, the major outer membrane protein (MOMP), significantly reduced infection in vitro by 24% and in vivo by 44%. Conversely, pIgR-mediated delivery of IgA targeting the intraepithelial inclusion membrane protein A (IncA) bound to the inclusion but did not reduce infection in vitro or in vivo. Similarly, intraepithelial IgA targeting the secreted protease Chlamydia protease-like activity factor (CPAF) also failed to reduce infection. Together, these data suggest the importance of pIgR-mediated delivery of IgA targeting extra- but not intraepithelial chlamydial antigens for protection against a genital tract infection.
Abstract:
This study aimed to investigate the spatial clustering and dynamic dispersion of dengue incidence in Queensland, Australia. We used Moran's I statistic to assess the spatial autocorrelation of reported dengue cases. Spatial empirical Bayes smoothing estimates were used to display the spatial distribution of dengue in postal areas throughout Queensland. Local indicators of spatial association (LISA) maps and logistic regression models were used to identify spatial clusters and examine the spatio-temporal patterns of the spread of dengue. The results indicate that the spatial distribution of dengue was clustered during each of the three periods of 1993–1996, 1997–2000 and 2001–2004. The high-incidence clusters of dengue were primarily concentrated in the north of Queensland and low-incidence clusters occurred in the south-east of Queensland. The study concludes that the geographical range of notified dengue cases has significantly expanded in Queensland over recent years.
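The global Moran's I statistic used above summarizes how similar each area's incidence is to that of its neighbours. As a minimal sketch only (the incidence values and contiguity weights below are hypothetical, not the Queensland postal-area data), it could be computed as:

```python
import numpy as np

def morans_i(x, w):
    """Global Moran's I for values x (one per area) and spatial weights matrix w.

    x : 1-D array of incidence values, one per postal area
    w : 2-D array of spatial weights (e.g. 1 if two areas share a border, else 0)
    """
    n = len(x)
    z = x - x.mean()                      # deviations from the mean
    num = np.sum(w * np.outer(z, z))      # sum_ij w_ij * z_i * z_j
    den = np.sum(z ** 2)                  # sum_i z_i^2
    return (n / w.sum()) * (num / den)

# Hypothetical example: four postal areas with rook-contiguity weights
incidence = np.array([12.0, 15.0, 2.0, 1.0])
weights = np.array([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]])
print(morans_i(incidence, weights))   # a value > 0 suggests spatial clustering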
Abstract:
Increased frequency of eating in the absence of homeostatic need, notably through snacking, is an important contributor to overconsumption and may be facilitated by the increased availability of palatable food in the obesogenic environment. Opportunistic initiation of snacking is likely to be subject to individual differences, although these are infrequently studied in laboratory-based research paradigms. This study examined psychological factors associated with opportunistic initiation of snacking, and predictors of intake in the absence of homeostatic need. Fifty adults (mean age 34.5 years, mean BMI 23.9 kg/m², 56% female) participated in a snack taste test in which they ate a chocolate snack to satiation, after which they were offered an unanticipated opportunity to initiate a second eating episode. Trait and behavioural measures of self-control, sensitivity to reward, dietary restraint and disinhibited eating were taken. Results showed that, contrary to expectations, those who initiated snacking had better inhibitory control than those who did not. However, amongst participants who initiated snacking, intake (kcal) was predicted by higher food reward sensitivity, impulsivity and BMI. These findings suggest that snacking initiation in the absence of hunger is an important contributor to overconsumption. Consideration of the individual differences promoting initiation of eating may aid in reducing elevated eating frequency in at-risk individuals.
Abstract:
This project expands upon the discovery that scabies mites produce protein molecules that interfere with the human complement cascade, disrupting a critical component of the early stages of the host immune response. This is believed to provide an optimal environment for the development of commonly associated secondary bacterial infections. The thesis investigated the effect of two distinct scabies mite proteins, namely SMS B4 and SMIPP-S I1, on the in vitro proliferation of Group A Streptococcus in whole human blood. Additionally, in vitro immunoassays were performed to determine whether complement-mediated opsonisation and phagocytosis were also disrupted.
Abstract:
Asthma prevalence in children has remained relatively constant in many Western countries, but hospital admissions for younger age groups have increased over time.[1] Although the role of outdoor aeroallergens as triggers for asthma exacerbations requiring hospitalization in children and adolescents is complex, there is evidence that increasing concentrations of grass pollen are associated with an increased risk of asthma exacerbations in children.[2] Human rhinovirus (HRV) infections are implicated in most of the serious asthma exacerbations in school-age children.[3] In previous research, HRV infections and aeroallergen exposure have usually been studied independently. To our knowledge, only 1 study has examined interactions between these 2 factors,[4] but lack of power prevented any meaningful interpretation...
Abstract:
Background: Australia has commenced public reporting and benchmarking of healthcare associated infections (HAIs), despite not having a standardised national HAI surveillance program. Annual hospital Staphylococcus aureus bloodstream (SAB) infection rates are released online, with other HAIs likely to be reported in the future. Although there are known differences between hospitals in Australian HAI surveillance programs, the effect of these differences on reported HAI rates is not known. Objective: To measure agreement in HAI identification, classification, and calculation of HAI rates, and to investigate the influence of differences amongst those undertaking surveillance on these outcomes. Methods: A cross-sectional online survey exploring HAI surveillance practices was administered to infection prevention nurses who undertake HAI surveillance. Seven clinical vignettes describing HAI scenarios were included to measure agreement in HAI identification, classification, and calculation of HAI rates; three of the vignettes related to surgical site infection and four to bloodstream infection. Data on the characteristics of respondents were also collected. Agreement levels for each of the vignettes were calculated. Using the Australian SAB definition, and the National Healthcare Safety Network definitions for other HAIs, we looked for an association between the proportion of correct answers and the respondents' characteristics. Results: Ninety-two infection prevention nurses responded to the vignettes. One vignette demonstrated 100% agreement from responders, whilst agreement for the other vignettes varied from 53 to 75%. Working in a hospital with more than 400 beds, working in a team, and State or Territory were associated with a correct response for two of the vignettes. Those trained in surveillance were more commonly associated with a correct response, whilst those working part-time were less likely to respond correctly. Conclusion: These findings reveal the need for further HAI surveillance support for those working part-time and in smaller facilities. They also confirm the need to improve the uniformity of HAI surveillance across Australian hospitals, and raise questions about the validity of current national comparisons of HAI SAB rates.
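To make the two quantities at the centre of this survey concrete, the sketch below shows how a per-vignette agreement level and an SAB rate per 10,000 patient-days might be calculated. The responses and denominators are hypothetical and are not data from the study.

```python
def agreement(responses, reference):
    """Proportion of respondents whose vignette classification matches the reference answer."""
    return sum(r == reference for r in responses) / len(responses)

def sab_rate_per_10000(infections, patient_days):
    """Healthcare-associated SAB rate expressed per 10,000 patient-days."""
    return infections / patient_days * 10_000

# Hypothetical vignette responses from four nurses and a hypothetical hospital denominator
print(agreement(["HAI", "HAI", "not HAI", "HAI"], reference="HAI"))   # 0.75 -> 75% agreement
print(sab_rate_per_10000(infections=8, patient_days=95_000))          # ~0.84 per 10,000 patient-days
```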
Abstract:
Background: Bloodstream infections resulting from intravascular catheters (catheter-BSI) in critical care increase patients' length of stay, morbidity and mortality, and the management of these infections and their complications has been estimated to cost the NHS £19.1–36.2M annually. Catheter-BSI are thought to be largely preventable using educational interventions, but guidance as to which types of intervention might be most clinically effective is lacking. Objective: To assess the effectiveness and cost-effectiveness of educational interventions for preventing catheter-BSI in critical care units in England. Data sources: Sixteen electronic bibliographic databases – including MEDLINE, MEDLINE In-Process & Other Non-Indexed Citations, Cumulative Index to Nursing and Allied Health Literature (CINAHL), NHS Economic Evaluation Database (NHS EED), EMBASE and The Cochrane Library databases – were searched from database inception to February 2011, with searches updated in March 2012. Bibliographies of systematic reviews and related papers were screened and experts contacted to identify any additional references. Review methods: References were screened independently by two reviewers using a priori selection criteria. A descriptive map was created to summarise the characteristics of relevant studies. Further selection criteria developed in consultation with the project Advisory Group were used to prioritise a subset of studies relevant to NHS practice and policy for systematic review. A decision-analytic economic model was developed to investigate the cost-effectiveness of educational interventions for preventing catheter-BSI. Results: Seventy-four studies were included in the descriptive map, of which 24 were prioritised for systematic review. Studies had predominantly been conducted in the USA, using single-cohort before-and-after study designs. Diverse types of educational intervention appear effective at reducing the incidence density of catheter-BSI (risk ratios statistically significantly < 1.0), but single lectures were not effective. The economic model showed that implementing an educational intervention in critical care units in England would be cost-effective and potentially cost-saving, with incremental cost-effectiveness ratios under worst-case sensitivity analyses of less than £5000 per quality-adjusted life-year. Limitations: Low-quality primary studies cannot definitively prove that the planned interventions were responsible for observed changes in catheter-BSI incidence. Poor reporting gave unclear estimates of risk of bias. Some model parameters were sourced from other settings owing to a lack of UK data. Conclusions: Our results suggest that it would be cost-effective, and may be cost-saving, for the NHS to implement educational interventions in critical care units. However, more robust primary studies are needed to exclude the possible influence of secular trends on observed reductions in catheter-BSI.
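The headline output of a decision-analytic model like this is an incremental cost-effectiveness ratio (ICER). As a minimal sketch only, with hypothetical per-patient costs and QALYs rather than values from the published model, the calculation is:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY gained.

    If the new strategy costs less and gains QALYs it 'dominates' the comparator
    and is usually reported simply as cost-saving rather than as a ratio.
    """
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical per-patient costs (£) and QALYs, not values from the published model
print(icer(cost_new=1050.0, cost_old=1000.0,
           qaly_new=0.062, qaly_old=0.050))   # 50 / 0.012 ≈ £4167 per QALY gained
```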
Abstract:
Chlamydial infections of fish are emerging as an important cause of disease in new and established aquaculture industries. To date, epitheliocystis, a skin and gill disease associated with infection by these obligate intracellular pathogens, has been described in over 90 fish species, including hosts from marine and freshwater environments. Aided by advances in molecular detection and typing, recent years have seen an explosion in descriptions of these epitheliocystis-related chlamydial pathogens of fish, significantly broadening our knowledge of the genetic diversity of the order Chlamydiales. Remarkably, in most cases it seems that each new piscine host studied has revealed the presence of a phylogenetically unique and novel chlamydial pathogen, providing researchers with a fascinating opportunity to understand the origin, evolution and adaptation of their traditional terrestrial chlamydial relatives. Despite the advances in this area, much still needs to be learnt about the epidemiology of chlamydial infections in fish if these pathogens are to be controlled in farmed environments. The lack of in vitro methods for culturing chlamydial pathogens of fish is a major hindrance to this field. This review provides an update on our current knowledge of the taxonomy and diversity of chlamydial pathogens of fish, discusses the impact of these infections on fish health, and highlights further areas of research required to understand the biology and epidemiology of this important emerging group of fish pathogens of aquaculture species.
Abstract:
While virulence factors and the biofilm-forming capabilities of microbes are the key regulators of the wound healing process, the host immune response may also contribute to the events following wound closure or to the exacerbation of non-closure. We examined samples from diabetic and non-diabetic foot ulcers/wounds for microbial association and tested the microbes for their antibiotic susceptibility and ability to produce biofilms. A total of 1074 bacterial strains were obtained, with staphylococci, Pseudomonas, Citrobacter and enterococci as the major colonizers in diabetic samples. Though non-diabetic samples had a similar assemblage, the frequency of occurrence of the different groups of bacteria differed. Gram-negative bacteria were more prevalent in the diabetic wound environment, while Gram-positive bacteria were predominant in non-diabetic ulcers. A higher frequency of monomicrobial infection was observed in samples from non-diabetic individuals than in samples from diabetic patients. The prevalence of different groups of bacteria varied when the samples were stratified according to the age and sex of the individuals. Several multidrug-resistant strains were observed among the samples tested, and most of these strains produced moderate to high levels of biofilm. The weakened immune response in diabetic individuals and synergism among pathogenic micro-organisms may be the critical factors that determine the delicate balance of the wound healing process.
Abstract:
We consider estimating the total load from frequent flow data and less frequent concentration data. There are numerous load estimation methods available, some of which are captured in various online tools. However, most estimators are subject to large statistical biases, and their associated uncertainties are often not reported. This makes interpretation difficult and makes it impossible to assess trends or to determine optimal sampling regimes. In this paper, we first propose two indices for measuring the extent of sampling bias, and then provide steps for obtaining reliable load estimates that minimize the biases and make use of informative predictive variables. The key step in this approach is the development of an appropriate predictive model for concentration. This is achieved using a generalized rating-curve approach with additional predictors that capture unique features in the flow data, such as the concept of the first flush, the location of the event on the hydrograph (e.g. rise or fall) and the discounted flow. The latter may be thought of as a measure of constituent exhaustion occurring during flood events. Incorporating this additional information can significantly improve the predictability of concentration, and ultimately the precision with which the pollutant load is estimated. We also provide a measure of the standard error of the load estimate which incorporates model, spatial and/or temporal errors. The method also has the capacity to incorporate measurement error incurred through the sampling of flow. We illustrate this approach for two rivers delivering to the Great Barrier Reef, Queensland, Australia. One is a dataset from the Burdekin River, consisting of total suspended sediment (TSS), nitrogen oxide (NO(x)) and gauged flow for 1997. The other dataset is from the Tully River, for the period July 2000 to June 2008. For NO(x) in the Burdekin, the new estimates are very similar to the ratio estimates even when there is no relationship between concentration and flow. However, for the Tully dataset, incorporating the additional predictive variables, namely the discounted flow and the flow phase (rising or receding), substantially improved the model fit, and thus the certainty with which the load is estimated.
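To illustrate the general shape of such an approach (not the authors' exact model), the sketch below fits a log-log rating curve for concentration with two extra predictors, a crude exponentially discounted flow and a rising-limb indicator, and then sums predicted concentration times flow to obtain a load. All data, column names and the discounting weight are hypothetical, and the retransformation bias correction discussed in the load-estimation literature is omitted for brevity.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical record: 'flow' observed every day, 'conc' sampled less often (NaN when unsampled)
df = pd.DataFrame({
    "flow": [20.0, 45.0, 180.0, 260.0, 210.0, 150.0, 95.0, 60.0, 35.0, 25.0],          # m^3/s
    "conc": [np.nan, 80.0, 160.0, np.nan, 140.0, np.nan, 95.0, np.nan, 55.0, np.nan],  # mg/L
})
df["rising"] = (df["flow"].diff().fillna(0) > 0).astype(int)       # rising vs receding limb
# A crude 'discounted flow': exponentially weighted past flow, a proxy for constituent exhaustion
df["disc_flow"] = df["flow"].ewm(alpha=0.5, adjust=False).mean()

# Generalized rating curve: regress log concentration on log flow plus the extra predictors,
# using only the time steps where a concentration sample exists
fit = smf.ols("np.log(conc) ~ np.log(flow) + np.log(disc_flow) + rising",
              data=df.dropna(subset=["conc"])).fit()

# Predict concentration at every flow observation and accumulate flow x concentration
df["conc_hat"] = np.exp(fit.predict(df))          # back-transform; bias correction omitted
seconds_per_day = 86_400
load_tonnes = (df["flow"] * df["conc_hat"] * seconds_per_day).sum() / 1e6   # g -> tonnes
print(f"Estimated load: {load_tonnes:.1f} t")
```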
Abstract:
In their recent Review, Walter Zingg and colleagues[1] presented the findings of a mixed methods systematic review done to describe the most effective elements of infection control programmes. We believe the inclusion of both qualitative and quantitative research in this Article is commendable, particularly because qualitative research contributes important context for clinicians, researchers, and policy makers when designing, implementing, and assessing interventions. However, in view of the large scope covered by the systematic review, and difficulties associated with a mixed methods synthesis approach,[2] we would like to seek further information from the authors...
Abstract:
Background: People admitted to intensive care units and those with chronic health care problems often require long-term vascular access. Central venous access devices (CVADs) are used for administering intravenous medications and blood sampling. CVADs are covered with a dressing and secured with an adhesive or adhesive tape to protect them from infection and reduce movement. Dressings are changed when they become soiled with blood or start to come away from the skin. Repeated removal and application of dressings can cause damage to the skin. The skin is an important barrier that protects the body against infection. Less frequent dressing changes may reduce skin damage, but it is unclear whether this practice affects the frequency of catheter-related infections. Objectives: To assess the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections and other outcomes including pain and skin damage. Search methods: In June 2015 we searched: The Cochrane Wounds Specialised Register; The Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library); Ovid MEDLINE; Ovid MEDLINE (In-Process & Other Non-Indexed Citations); Ovid EMBASE and EBSCO CINAHL. We also searched clinical trials registries for registered trials. There were no restrictions with respect to language, date of publication or study setting. Selection criteria: All randomised controlled trials (RCTs) evaluating the effect of the frequency of CVAD dressing changes on the incidence of catheter-related infections in all patients in any healthcare setting. Data collection and analysis: We used standard Cochrane review methodology. Two review authors independently assessed studies for inclusion, performed risk of bias assessment and data extraction. We undertook meta-analysis where appropriate, or otherwise synthesised data descriptively when heterogeneous. Main results: We included five RCTs (2277 participants) that compared different frequencies of CVAD dressing changes. The studies were all conducted in Europe and published between 1995 and 2009. Participants were recruited from the intensive care and cancer care departments of one children's and four adult hospitals. The studies used a variety of transparent dressings and compared a longer interval between dressing changes (5 to 15 days; intervention) with a shorter interval between changes (2 to 5 days; control). In each study, participants were followed up until the CVAD was removed or until discharge from ICU or hospital.
- Confirmed catheter-related bloodstream infection (CRBSI): One trial randomised 995 people receiving central venous catheters to a longer or shorter interval between dressing changes and measured CRBSI. It is unclear whether there is a difference in the risk of CRBSI between people having long or short intervals between dressing changes (RR 1.42, 95% confidence interval (CI) 0.40 to 4.98) (low quality evidence).
- Suspected catheter-related bloodstream infection: Two trials randomised a total of 151 participants to longer or shorter dressing intervals and measured suspected CRBSI. It is unclear whether there is a difference in the risk of suspected CRBSI between people having long or short intervals between dressing changes (RR 0.70, 95% CI 0.23 to 2.10) (low quality evidence).
- All cause mortality: Three trials randomised a total of 896 participants to longer or shorter dressing intervals and measured all cause mortality. It is unclear whether there is a difference in the risk of death from any cause between people having long or short intervals between dressing changes (RR 1.06, 95% CI 0.90 to 1.25) (low quality evidence).
- Catheter-site infection: Two trials randomised a total of 371 participants to longer or shorter dressing intervals and measured catheter-site infection. It is unclear whether there is a difference in risk of catheter-site infection between people having long or short intervals between dressing changes (RR 1.07, 95% CI 0.71 to 1.63) (low quality evidence).
- Skin damage: One small trial (112 children) and three trials (1475 adults) measured skin damage. There was very low quality evidence for the effect of long intervals between dressing changes on skin damage compared with short intervals (children: RR of scoring ≥ 2 on the skin damage scale 0.33, 95% CI 0.16 to 0.68; data for adults not pooled).
- Pain: Two studies involving 193 participants measured pain. It is unclear if there is a difference between long- and short-interval dressing changes in pain during dressing removal (RR 0.80, 95% CI 0.46 to 1.38) (low quality evidence).
Authors' conclusions: The best available evidence is currently inconclusive regarding whether longer intervals between CVAD dressing changes are associated with more or less catheter-related infection, mortality or pain than shorter intervals.
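The pooled effect estimates above are risk ratios with 95% confidence intervals. As a minimal sketch only, using the standard log-normal approximation and hypothetical event counts rather than data from the included trials, the calculation is:

```python
import math

def risk_ratio_ci(events_int, n_int, events_ctrl, n_ctrl, z=1.96):
    """Risk ratio and approximate 95% CI (log-normal method) from a 2x2 table."""
    rr = (events_int / n_int) / (events_ctrl / n_ctrl)
    # Standard error of log(RR)
    se = math.sqrt(1/events_int - 1/n_int + 1/events_ctrl - 1/n_ctrl)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper

# Hypothetical arm sizes and event counts, not data from the included trials
print(risk_ratio_ci(events_int=6, n_int=500, events_ctrl=4, n_ctrl=495))
```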